macOS install not using brew installed libraries #439
Hi! First, I'm the guy who wrote the latest installers and the installer documentation, and based on your post I think you know more about this than me. Just so we're level set. 😀

Probably 99% of the complexity of installing MPF-MC is not related to the mpf-mc package itself, but rather to the installation of Kivy. Kivy, in turn, interfaces with all the SDL2 and GStreamer libs and all that. If you look into Kivy, either in their docs, their forums, or their repos, you'll see there are lots and lots of complexities and ways to specify library paths, which libs are used, etc. I sorta picked the easiest defaults I could get working, but without 100% knowing what I was doing. So you may want to pick around in the Kivy world to see what you can figure out?

Some of the libraries that MPF-MC uses don't matter. For example, Kivy might support options A, B, or C for images (like SDL2_image, or PIL, or...?), and it doesn't really matter to MPF-MC which one you use. MPF-MC is just making Kivy calls, and Kivy is handling the rest, so if it works with Kivy it will work with MPF-MC. Audio is a bit different since MPF-MC has its own custom implementation that uses GStreamer, so you need GStreamer for audio, but the rest I think can work with anything?

Also, if you have any ideas, suggestions, fixes, recommendations, etc., we would love the feedback. I'm not a "real" developer, I just write some Python code to do pinball things, so all these multimedia libraries and installers and most of what you're writing is way over my head. Thanks for these details and for digging in!
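One concrete example of the "which libs are used" knobs mentioned above: Kivy reads environment variables at startup to pick which provider backs each subsystem, so the selection happens at the Kivy layer rather than in mpf-mc itself. A minimal sketch, assuming an SDL2-based Kivy install; the specific provider names only work if they were compiled into your Kivy:

```bash
# Sketch: restrict which providers Kivy uses before launching the media controller.
# These are standard Kivy environment variables; the values are examples and
# only work if the corresponding providers exist in your Kivy build.
export KIVY_WINDOW=sdl2      # window / GL context provider
export KIVY_IMAGE=sdl2,pil   # image loaders, tried in this order
export KIVY_TEXT=sdl2        # font rendering provider
mpf both                     # or however you normally start MPF + MPF-MC
```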
Thank you for being honest that all of this is a best effort... I totally understand. I'm a software engineer who's trying to learn how to build a pinball machine, and there's a lot to understand. I very much appreciate that you're all here and offering help. I did get my install working... although I'm not sure yet how you could go about making it better for everyone. It took me a few hours to dive into Python...
A comparison of what's in the x86_64 .whl files available for download (from
(bitmap_font.cpython-38-darwin.so is referencing local libSDL files, not the ones installed in /usr/local/opt). I had to manually install my .whl file in the mpf virtual environment, but after that I tried to make sense of why the .whl files you produce feel they need to contain prebuilt libraries, and I'm not seeing how that happens. I wonder if you're building the macOS images from Windows somehow, using a version of the Kivy macOS packager that thinks it needs to include these libraries, or whether CPython on your machine has issues. Or sunspots... those guys are always suspect.

No idea if any of this is helpful, and I'm not sure how you could make this process better (other than making the universal .whl really universal... it's only producing arm64 slices, which is probably this bug in CPython). One package would be easier to maintain than two, in theory. As it stands, this bug could be closed since it's no longer blocking me, but if you'd like to continue sorting out what's going on here, I'm happy to help track it down.
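For anyone retracing the wheel comparison above, this is roughly how the downloaded wheels can be inspected. The wheel filename is a placeholder for whichever build you grab, and a wheel is just a zip archive:

```bash
# Sketch: unpack a wheel and check what it actually ships.
unzip -d mpfmc_wheel "mpf_mc-<version>-cp38-cp38-macosx_10_15_x86_64.whl"   # placeholder name

# Which CPU slices ended up in the compiled extension?
lipo -archs mpfmc_wheel/mpfmc/uix/bitmap_font/bitmap_font.cpython-38-darwin.so

# Which libraries does it load: bundled copies under .dylibs/, or brew's /usr/local/opt ones?
otool -L mpfmc_wheel/mpfmc/uix/bitmap_font/bitmap_font.cpython-38-darwin.so
```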
Again, thanks for your efforts here. This is all great. Here's some information on the Mac wheel building process(es), which might be helpful. I agree a single universal wheel would be great.

So here's how this stuff works now. First, everything is built in GitHub cloud runners. Unfortunately, GitHub does not offer ARM-based Mac runners (I think it's on the roadmap for 3Q23), so the GitHub workflow which builds everything only builds x86 Mac wheels. This is the main file that controls everything which happens on GitHub on check-in: https://github.com/missionpinball/mpf-mc/blob/dev/.github/workflows/build_wheels.yml

I think you can kinda figure out what that's doing. It uses the cibuildwheel project, which hides some of the complexity, but you can see in lines 62-83 what setup runs on Mac and what the build commands are, plus the matrix entries which drive which Mac builds run. Then for ARM Macs, I just manually build those locally using this build script from the repo.

So if there's any value in updating the local build script I run (which is probably all we can do for now), I'd love another set of eyes on this. If we can get a real universal build happening locally, then I'll disable the cloud builds for x86 Mac for now, and we can flip over to whatever new technique we figure out when GitHub gets ARM Mac runners in the future.
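For reference, cibuildwheel can also be driven locally on an Apple Silicon machine with the architectures pinned through environment variables. This is only a sketch, not the repo's actual build script, and the CIBW_* values here are examples:

```bash
# Sketch: run the same cibuildwheel machinery locally instead of in a GitHub runner.
python -m pip install cibuildwheel
CIBW_BUILD="cp39-*" \
CIBW_ARCHS_MACOS="arm64" \
python -m cibuildwheel --platform macos --output-dir wheelhouse
```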
This makes more sense... I missed the GitHub workflow (wasn't even aware it was a thing, to be honest. TIL). GitHub's docs for how this works are pretty good, which is very nice. Can I ask why you build for three versions of Python? Is that supporting different customer needs, or does each version do something different? The universal build only covers 3.9 (which is what I would expect). Give me some time and I'll poke at this. I would think that a universal build should be possible, but I'll need to go down the Python build-system rabbit hole again to be sure.
OK... I think I understand what's going on now.

My original issue is due to the "delocate" process that runs on wheels targeting macOS. Delocating means copying dependencies into the wheel and fixing up their loader paths so the Python library being built does not need any other dependencies installed on the host system (wheels are supposed to be portable, and this makes them so). Clever, but confusing the first time you encounter it. In my case, it copied a library that references a symbol my version of IOKit doesn't have, because my OS is super old. I gather that cibuildwheel runs this delocate step as a matter of course, but I didn't go down that rabbit hole far enough to confirm.

Instead, I worked on figuring out how to build a universal wheel. The issue I referenced before about CPython having problems was out of date (it's been fixed) and pertained to building universal versions of the Python framework itself, not wheels. But understanding that issue got me to a page about cross-compiling for arm64 from a GitHub workflow (apparently cibuildwheel supports it; see grpc/grpc#29262 and https://github.com/pietrodn/grpcio-mac-arm-build/blob/main/.github/workflows/wheels.yml). So I cloned the MPF repos, edited the build_wheels.yml file to cross-compile and... neglected to realize until it failed that the dependencies on the runner would still be x86_64, not arm64. So while it tries to compile for arm64, you get fun errors when compiling against the SDL2 libraries.
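For anyone who wants to retrace the delocate half of this diagnosis, the delocate package ships a small CLI that shows what a wheel depends on and performs the copy/path-rewrite step. The wheel path below is a placeholder:

```bash
# Sketch: inspect and reproduce the delocate step that cibuildwheel runs on macOS.
python -m pip install delocate

# List the external libraries the built wheel links against.
delocate-listdeps dist/mpf_mc-*.whl

# Copy those libraries into the wheel and rewrite their load paths
# so the wheel no longer needs them installed on the host.
delocate-wheel -w wheelhouse dist/mpf_mc-*.whl
```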
I tried forcing brew to install the arm64 libraries on GitHub's builder and it didn't care for that at all (it failed immediately, which was not what I expected). The reason grpcio can do it and MPF cannot is that grpcio has no external dependencies; it's just building itself. If you wanted to build your entire dependency graph from source, I'm betting you could make this work too... but that way lies madness. I also tried building the universal wheel on my own machine. I installed all the x86_64 libraries via brew and compiling still worked, but...

Given all the constraints, I think you're actually building your distributions as well as you can right now. And even when GitHub can support arm64 builds, I don't think you want to create universal2 binaries. They'd be twice the size (assuming you delocate the dependencies into the wheel) and have no benefit over per-architecture wheels, since python/pip prefers matching-architecture wheels over universal ones anyway.

There are a couple of things worth pointing out. First, the universal2 wheel you're building isn't universal (it's lacking the proper references to x86_64 libraries). You probably aren't getting complaints because x86_64 macOS users get the x86_64 wheel, but if they were to try the universal version, they'd get errors. You can either stop building the universal variant (which I'm sure you can do, but I'm not sure how; I think which architectures get compiled comes from the Python.framework that's required, and that prefers universal binaries by default), or you can try to fix up the paths to point to the right locations for x86_64. That would be gross, and a waste of time anyway. Or you can ignore this problem entirely; arm64 users are unlikely to copy their MPF installs to x86_64 machines rather than just reinstalling.

Second, the universal version isn't "delocated": it expects to find dependencies installed on the host machine. This really isn't a problem, since you have us install the dependencies as part of the MPF install process. But it does make MPF less portable and susceptible to libraries changing out from underneath it. It's also a behavior difference between the x86_64 and arm64 versions, which could be a weird support issue to track down later. If you do happen to run delocation on the universal binary, I think you could simplify the install process: if all of your dependencies are included in the wheel, there is no reason to have MPF users install them too. This is why you get the "classes implemented in multiple locations" errors at run time; the linker is noticing both the brew-installed libraries and the libraries that were copied into the wheel at build time. I didn't test this, but assuming both MPF and MPF-MC delocate their dependencies, it should work.

I am sorry and disappointed I wasn't able to figure out a good way for you to build a universal wheel, but I understand a whole lot more about the Python library/wheel building process now. Maybe I'll be able to help with something else in the future. For now though, I think we can close this issue and I'll build my own MPF for my old 10.15 machine. Thank you again for the help and the pointers. This was a fun diversion, getting to understand the internals.
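As a small footnote to the "pip prefers the matching architecture" point above, pip can print the wheel tags it accepts, in priority order, so you can verify on your own machine that the arch-specific macOS tags rank ahead of universal2:

```bash
# Sketch: show the macOS wheel tags this interpreter accepts, highest priority first.
# On an arm64 Python, the macosx_*_arm64 tags appear before macosx_*_universal2,
# which is why a per-architecture wheel beats a universal2 one when both exist.
python -m pip debug --verbose | grep macosx | head -n 15
```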
Let me start off by saying I'm trying to get MPF to run on an older build of macOS: 10.15.7. The hardware is older (a Late 2012 Mac mini) and isn't going to benefit from a newer OS. I know this is going to be more challenging (I've already fought through some brew issues).
That said, after installing all the necessary bits I find myself with the following problem. When running the mc_demo machine, I get this error:
Poking at `/Users/pinball/.local/pipx/venvs/mpf/lib/python3.9/site-packages/mpfmc/uix/bitmap_font/../../.dylibs/libSDL2-2.0.0.dylib`, I indeed see it has a reference to `_kIOMainPortDefault`.

This might be the core of the problem... 10.15's IOKit doesn't seem to have a `kIOMainPortDefault`, only the Master variant. But I'm curious why bitmap_font chose to use this dylib at all, when libSDL2 is already installed via brew and that copy seems to have noticed the symbol issue and used the right one.
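For anyone on a similar OS who wants to reproduce the check, the symbol references can be compared with nm. The paths below come from this pipx install and the standard Intel brew prefix, so adjust for your machine:

```bash
# Sketch: compare which IOKit port symbol each copy of libSDL2 references.

# The copy bundled inside the wheel (this resolves the ../../ from the path above):
nm -u /Users/pinball/.local/pipx/venvs/mpf/lib/python3.9/site-packages/mpfmc/.dylibs/libSDL2-2.0.0.dylib | grep kIO

# The brew-installed copy:
nm -u /usr/local/opt/sdl2/lib/libSDL2-2.0.0.dylib | grep kIO

# What 10.15's IOKit actually exports (on newer macOS this binary lives in the
# dyld shared cache, so the file may not exist on disk there):
nm -gU /System/Library/Frameworks/IOKit.framework/IOKit | grep PortDefault
```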
If I symlink brew's version of the dylib into the .dylibs folder that bitmap_font is looking at, it gets past this error (and onto another missing symbol in a different library). The core of my problem seems to be that mpf-mc wants to use its own dylibs when legit ones are already installed. I went looking for the logic that determines how it chooses which to use, but pip-installed packages are a dark art I don't understand.
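Concretely, the workaround was along these lines; the venv path comes from the pipx layout above, and the brew path assumes a standard Intel Homebrew prefix:

```bash
# Sketch of the workaround: swap the wheel's bundled SDL2 for the brew-installed one.
DYLIBS=/Users/pinball/.local/pipx/venvs/mpf/lib/python3.9/site-packages/mpfmc/.dylibs
mv "$DYLIBS/libSDL2-2.0.0.dylib" "$DYLIBS/libSDL2-2.0.0.dylib.bundled"   # keep the original around
ln -s /usr/local/opt/sdl2/lib/libSDL2-2.0.0.dylib "$DYLIBS/libSDL2-2.0.0.dylib"
```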
Is there a way to force mpf-mc to use brew's library install instead of the shared libs it ships with? It seems to have this capacity: I have MPF installed on an M1 Mac running macOS 13, and there it installed mpf-mc against the brew-installed components.
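One route that might accomplish this without patching the installed wheel, assuming the source distribution builds cleanly against whatever SDL2/GStreamer brew provides, is to tell pip to skip the prebuilt wheel entirely. The brew formula list here is illustrative rather than the official install set:

```bash
# Sketch: build mpf-mc from source so its extensions link against the brew libraries
# instead of the prebuilt, delocated copies shipped in the wheel.
brew install sdl2 sdl2_image sdl2_mixer sdl2_ttf gstreamer
pip install --no-binary mpf-mc mpf-mc
```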
Thanks for reading this far!