Posted by mschuster91 23 hours ago
SDR has opened my eyes to a lot of open-to-the-world, remotely accessible interfaces where the only protection is "you must accept harmful interference, and you should not cause harmful interference".
Utterly unbelievable that no Western government has tackled that situation. The market for basebands is completely and utterly rotten:
- Qualcomm dominates the industry and can get away with pretty much all sorts of behavior
- Samsung has their own basebands but only uses them on their premium phones
- Huawei has basebands, but IIRC they're only used in data sticks and the like, and on top of that Huawei is subject to sanctions, so it's even more unlikely to see them in a major phone sold in Western markets.
- Mediatek covers the rest of the market, especially the low end.
That this lack of competition disincentivizes all actors from investing in code quality and security is obvious to anyone who has ever looked even a bit into the phone BSP side - it's hard to imagine the baseband binary blob is any different.
Another problem is that it's a highly difficult market to enter. Pure 2G and 5G implementations exist in Osmocom, but they're practically useless in a consumer environment, and anything in between is locked hard behind extremely complex standards on one side, regulatory enforcement in the middle, and finally patents. Even Apple hasn't managed to kick Qualcomm and Broadcom to the curb where they belong.
Not a barrier per se, but the testing effort required to achieve worldwide certification, not to mention the testing effort for interoperability, is enormously expensive.
As for the patents, Apple did just that by buying up what remained of Intel's baseband business, and still wasn't able to deliver anything on that front for years.
They did eventually ship an Apple cellular modem in the iPhone SE, without mmWave support.
That's not quite true.
https://www.reuters.com/technology/apple-reveals-first-custo...
Sure - they all have different market segments - but any of them would seem to have the ability to nibble into any other's segment if they wanted to, which in turn keeps licensing costs in check.
Uh? There are Samsung Exynos devices not using a Samsung baseband? (Exynos spans a large part of the range, just not the sub-$100 tier.)
> Utterly unbelievable that no Western government has tackled that situation. The market for basebands is completely and utterly rotten:
There is a global problem: in a lot of areas there is monopoly lock-in via standards. Those companies increasingly shape their strategies around controlling the way standards are written, making them more complicated and costly for third parties to implement.
One example of a standard made more complicated in a way that adds more patents is DVB-T2 [1]. 95% of the real-world use of DVB-T2 compared to DVB-T comes down to higher modulation rates and improved FEC, but it also adds PLP (which I've seen maybe three demos of), which is covered by several patents, greatly increasing complexity and patent cost.
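To make that concrete, here is a rough illustrative sketch (in Python; feature lists recalled from memory and heavily simplified, so treat it as an assumption rather than a reading of the specs) of roughly what DVB-T2 layers on top of DVB-T - the constellation/FEC upgrade is what virtually every deployment actually uses, while the PLP machinery is where much of the extra complexity sits:

    # Illustrative, simplified comparison of DVB-T vs DVB-T2 physical-layer
    # features; values recalled from memory - see ETSI EN 300 744 (DVB-T) and
    # ETSI EN 302 755 (DVB-T2) for the authoritative parameter sets.
    dvb_t = {
        "constellations": ["QPSK", "16QAM", "64QAM"],
        "fec": "convolutional + Reed-Solomon",
        "fft_modes": ["2K", "8K"],
        "plp": None,  # single stream, no physical layer pipes
    }
    dvb_t2 = {
        "constellations": ["QPSK", "16QAM", "64QAM", "256QAM"],
        "fec": "LDPC + BCH",
        "fft_modes": ["1K", "2K", "4K", "8K", "16K", "32K"],
        "plp": "multiple physical layer pipes, each with its own modulation/FEC",
    }

    # Everything DVB-T2 changes or adds relative to DVB-T:
    delta = {k: v for k, v in dvb_t2.items() if dvb_t.get(k) != v}
    print(delta)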
FWIW, I love standards and I agree that the industry should largely participate in making them. I also agree that standards need to have "SHALL"s; you can't make everything optional just to allow for lower costs. And I won't pretend there is an easy solution to these problems.
Sadly, the only way I can see to improve this situation is to increase governments' public funding of standardization.
[1] FWIW, I know nothing about how DVB-T2 was written or who did it, so it's just an example of complicated requirements increasing the number of patents and thus the cost of implementation. It's possible that those requirements were added in good faith.
The S23, for example, runs a Qualcomm X70 baseband with either a Samsung Exynos or a Qualcomm Snapdragon SoC depending on the region. Why that's the case (the same device using a Qualcomm modem but different CPU vendors), no idea.
> Sadly, the only way I can see to improve this situation, is to increase government's public funding into standardization.
IMHO, the thing that should be funded is fundamental research. As it is, most research in the RF and codec sector is in private hands, so it's no wonder that things like patent trolls eventually arose.
But eh, that's not gonna happen, not in the US (for Trumpian anti-intellectualism reasons), not in Europe (we lack the money, the brains and the willpower to cut through the red tape) and not in China either.
> Utterly unbelievable that no Western government has tackled that situation
Yes, I think it would behoove most countries to do what China, Taiwan, and Korea have all done: force Qualcomm into an antitrust settlement where they have to give up parts of their patent portfolio.
> That this lack of competition disincentivizes all actors from making investments into code quality and security is obvious
With the above said, I don't agree with this take at all; in my experience almost all hardware has horrible firmware, drivers/BSPs, and software no matter how competitive the market, and whether it's good or not is driven almost completely by company culture and standards, not by market forces. I don't think the market selects for code quality at all. Janky low-level code is culturally pervasive across the globe. Hardware is delivered on a strict, fixed timeline and is invisible to the end user, and there's a very strong emphasis on functional testing and validation. This means that nobody cares what duct-tape-and-baling-wire hacks are necessary to get units out the door as long as they pass validation, ship on time, and function. So a culture of corner-cutting hackery is passed down from generation to generation of embedded engineers.
The only places I don't see Bad Hardware Code happen are places where Software People are present and start imposing software culture on the Hardware People (Google, Apple, etc.), or companies making an intense, concentrated effort toward becoming more of a Software Company (Nvidia). This tends to come at a major cost, too - yes, the code gets better, but in exchange everything starts taking forever (see: Apple's baseband project).