Using pypa's build on a Python project leads to a generic "none-any.whl" wheel, but the package has OS-specific binaries (Cython)

Question:

I am trying to build a package for distribution that contains Cython code which I would like to compile into binaries before uploading to PyPI. To do this I am using pypa’s build,

python -m build

in the project’s root directory. This cythonizes the code, generates the binaries for my system, and then creates the sdist and wheel in the dist directory. However, the wheel is named with the generic "py3-none-any" tag. When I unzip the .whl I do find the appropriate binaries stored (e.g., cycode.cp39-win_amd64.pyd). The problem is that I plan to run this in a GitHub workflow where binaries are built for multiple Python versions and operating systems. That workflow runs fine, but uploading to PyPI then overwrites (or raises a duplicate-version error) because the wheels from the various operating systems all share the same name. And if I install from PyPI on another OS I get "module can't be found" errors, since the binaries for that OS are not there and, because it was a wheel, the installation did not re-compile the Cython files.
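
One quick way to confirm which tag was actually assigned is to read the WHEEL metadata inside the archive. This is only a diagnostic sketch (the file name is made up, and it assumes a freshly built wheel is sitting in dist/):

# check_wheel_tag.py - hypothetical helper, not part of the project
import glob
import zipfile

wheel_path = sorted(glob.glob('dist/*.whl'))[0]
with zipfile.ZipFile(wheel_path) as whl:
    # every wheel ships a .dist-info/WHEEL file recording its purity and tags
    wheel_meta = next(n for n in whl.namelist() if n.endswith('.dist-info/WHEEL'))
    print(whl.read(wheel_meta).decode())

# A pure wheel reports "Root-Is-Purelib: true" and "Tag: py3-none-any";
# a platform wheel reports e.g. "Tag: cp39-cp39-win_amd64".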

I am working with 64-bit Windows, macOS, and Ubuntu, Python versions 3.8-3.10, and a small set of other packages, which are listed below.

Does anyone see what I am doing wrong here? Thanks!

Simplified Package

Tests
Project
    __init__.py
    pycode.py
    cymod
        __init__.py
        _cycode.pyx
_build.py
pyproject.toml

pyproject.toml

[project]
name = 'Project'
version = '0.1.0'
description = 'My Project'
authors = ...
requires-python = ...
dependencies = ...

[build-system]
requires = [
    'setuptools>=64.0.0',
    'numpy>=1.22',
    'cython>=0.29.30',
    'wheel>=0.38'
]
build-backend = "setuptools.build_meta"

[tool.setuptools]
py-modules = ["_build"]
include-package-data = true
packages = ["Project",
            "Project.cymod"]

[tool.setuptools.cmdclass]
build_py = "_build._build_cy"

_build.py

import os
from setuptools.extension import Extension
from setuptools.command.build_py import build_py as _build_py


class _build_cy(_build_py):

    def run(self):
        self.run_command("build_ext")
        return super().run()

    def initialize_options(self):
        super().initialize_options()
        import numpy as np
        from Cython.Build import cythonize
        print('!-- Cythonizing')
        if self.distribution.ext_modules is None:
            self.distribution.ext_modules = []

        # Add to ext_modules list
        self.distribution.ext_modules.append(
                Extension(
                        'Project.cymod.cycode',
                        sources=[os.path.join('Project', 'cymod', '_cycode.pyx')],
                        include_dirs=[os.path.join('Project', 'cymod'), np.get_include()]
                        )
                )

        # Cythonize the collected ext_modules
        self.distribution.ext_modules = cythonize(
                self.distribution.ext_modules,
                compiler_directives={'language_level': "3"},
                include_path=['.', np.get_include()]
                )
        print('!-- Finished Cythonizing')
Asked By: Oniow

Answers:

This was solved by adding a nearly empty setup.py file to the root directory, alongside pyproject.toml. The rest of the files are the same as in the original post.

# setup.py
import os

import numpy as np
from setuptools import Extension, setup

extensions = [
    Extension(
        'Project.cymod.cycode',
        sources=[os.path.join('Project', 'cymod', '_cycode.pyx')],
        include_dirs=[os.path.join('Project', 'cymod'), np.get_include()],
    )
]


setup(
    ext_modules=extensions
)
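
The reason this works seems to be a matter of timing (my reading of the setuptools/wheel internals, not something documented for this exact case): bdist_wheel chooses between a pure and a platform tag based on Distribution.has_ext_modules() when the command is finalized, which is before build_py (and therefore the _build_cy hook) has appended anything to ext_modules. Declaring the extensions in setup.py makes has_ext_modules() truthy from the start. A small illustration of that behaviour, not part of the package:

from setuptools.dist import Distribution

dist = Distribution()
print(bool(dist.has_ext_modules()))   # False -> wheel gets tagged py3-none-any

# In the cmdclass-only approach the extensions show up this late, after the
# pure/impure decision has effectively been made.
dist.ext_modules = ['Project.cymod.cycode']
print(bool(dist.has_ext_modules()))   # True, but only once it no longer matters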

OS-specific wheels (tagged e.g. cp39-cp39-win_amd64) are now being built rather than the broken generic ones. It looks like this is a limitation of building from pyproject.toml alone at the moment.
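
For completeness, another workaround that is sometimes suggested for this symptom is to tell setuptools outright that the distribution is not pure Python by overriding Distribution.has_ext_modules(). The sketch below is illustrative only and is not the approach used above:

# setup.py - hypothetical alternative sketch
from setuptools import setup
from setuptools.dist import Distribution


class BinaryDistribution(Distribution):
    """Report compiled extensions so bdist_wheel emits a platform-specific tag."""

    def has_ext_modules(self):
        return True


setup(distclass=BinaryDistribution)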

Answered By: Oniow