A friend recently got a (carrier-supplied) phone and has been complaining about how it would often have no reception despite showing a good signal; taking mine to the same areas on the same carrier and doing a comparison, mine was indeed showing no bars on the signal indicator. The difference is, mine predates this stupidity, and I can also see the details in the MTK Engineer Mode app, which shows the actual signal strength --- it was around -140dBm when it was showing 0 bars.
> taking mine to the same areas on the same carrier and doing a comparison
Unfortunately I don't think it's that simple. I've seen one phone simultaneously show significantly different numbers of bars for two SIMs installed in it for the same exact network and operator. After a while they become similar... then differ again... etc.
I have no clue how to explain it yet, but what I do know is that it literally makes no sense with a naive model of how these work, whether you try to explain it as reception or deception.
The phone selects a RAT (radio access technology) and frequency for each SIM slot.
After selecting, each SIM slot is subject to inter-frequency / inter-RAT reselection and handover.
Both are controlled by messages received from the tower (e.g. on 4G LTE, reselection is governed by System Information messages), though there is an additional constraint: what's supported by, and enabled in, the phone.
Perhaps one SIM slot was in the connected state and the other was in the idle state at one point. So the reselection logic applied for one and the handover logic applied for the other. There is for example a problem called ping pong handover. Once a phone is switched to a different frequency or RAT, the tower may have the phone be sort of stuck in the new frequency, until the conditions of the previous RAT or frequency improve substantially, in order to prevent the phone being like a ping pong ball between the two. This frees resources that would otherwise be spent on repeated handover-related messages.
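The ping-pong prevention described above is typically a hysteresis rule: the neighbor must beat the serving cell by some margin before a switch happens. A minimal sketch, with a made-up margin (real networks also add a time-to-trigger and per-cell offsets):

```java
// Illustrative sketch of hysteresis-based cell selection, as used to avoid
// "ping pong" between two cells. The 3 dB margin is invented for illustration.
public class HandoverSketch {
    static final double HYSTERESIS_DB = 3.0; // neighbor must beat serving by this margin

    // Decide whether to hand over from the serving cell to a neighbor,
    // given their measured signal strengths in dBm.
    static boolean shouldHandOver(double servingDbm, double neighborDbm) {
        return neighborDbm > servingDbm + HYSTERESIS_DB;
    }

    public static void main(String[] args) {
        // Neighbor only 1 dB better: stay put, avoiding a pointless handover.
        System.out.println(shouldHandOver(-100, -99));  // false
        // Neighbor clearly better: hand over.
        System.out.println(shouldHandOver(-100, -95));  // true
    }
}
```

With no margin, a phone sitting exactly between two cells would flip back and forth on every small fade; the margin makes the current cell "sticky" in both directions.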
Each frequency has its own signal strength (free space path loss, transmit power, one frequency might be on one tower and another might be on another, etc).
This is usually for a good reason - dual sim phones are almost always “DSDS”, or “Dual SIM Dual Standby”. The secondary SIM, because it doesn’t need to make a data connection, parks itself on the lowest-frequency (and therefore usually lowest-bandwidth) connection it can find. Meanwhile, your data-connected SIM is busy trying to stream a video or upload your photos, so it’s using a higher-frequency + higher bandwidth connection, resulting in a lower signal strength.
> Meanwhile, your data-connected SIM is busy trying to stream a video or upload your photos
You're making huge and incorrect assumptions here, no? This also happens when your phone is entirely idle... and it randomly changes if you sit still for some time...
I think it has more to do with the cellular modem itself, or precisely the firmware it's running; of which there is much more diversity on the Android side.
When we visit our city's downtown, I get great data coverage. My wife, on a different model but same-generation iPhone and the same plan, gets nothing. Her phone shows three or four bars but her apps won't load anything.
No idea why, especially since I'm the one who installs ad blockers and such. Her phone is essentially stock.
-140 dBm is far beyond no coverage, yeah. -120 dBm is pretty much when LTE stops working (sometimes it can painfully stretch to -123 to -125 but usually not because of noise etc)
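Android maps LTE RSRP to bar levels through configurable thresholds, so a sketch of the idea looks like the following. The cutoffs here are illustrative only (real devices read them from carrier config), but they're consistent with the numbers in this thread: -125 dBm borderline, -140 dBm dead.

```java
// Sketch of mapping LTE RSRP (dBm) to a 0-4 bar level.
// Threshold values are illustrative; real ones come from carrier config.
public class BarsSketch {
    static int rsrpToLevel(int rsrpDbm) {
        if (rsrpDbm >= -98) return 4;
        if (rsrpDbm >= -108) return 3;
        if (rsrpDbm >= -118) return 2;
        if (rsrpDbm >= -125) return 1;
        return 0; // e.g. -140 dBm: effectively no usable LTE signal
    }

    public static void main(String[] args) {
        System.out.println(rsrpToLevel(-140)); // 0 bars
        System.out.println(rsrpToLevel(-125)); // 1 bar, borderline
    }
}
```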
I highly recommend the Network Cell Info Lite app for network diagnostics. It shows signal strength with full details for each of the SIM modules, shows on a map in real time where the base stations you are currently connected to are located, and other interesting statistics.
I implemented the same behavior in a different Google product.
I remember the PM working on this feature showing us their research on how iPhones rendered bars across different versions.
They had different spectrum ranges, one for each of maybe the last 3 iPhone versions at the time. And overlayed were lines that indicated the "breakpoints" where iPhones would show more bars.
And you could clearly see that on every release, iPhones were shifting all the breakpoints further and further to the left, rendering more bars with less signal strength.
We tried to implement something that matched the most recent iPhone version.
To be sure, is it possible that, on each subsequent iPhone release, the hardware got better at handling weak signals, and thus a mediocre signal for iPhone N was decent for iPhone N+2 and would give great throughput on iPhone N+4?
Possible, sure, but wouldn't it be better marketing for the iPhone to have better performance on fewer bars? Phones are judged on their performance, but network providers on the number of bars they show on the screen.
The comment you’re replying to is incredibly concerning. Is he saying people at Google are purposefully misrepresenting signal strength so they can “compete” with Apple?
Bars really don’t matter. You can have full bars and slow to no internet. You can have one bar but relatively decent internet. Honestly kind of wish the signal display would go away and instead show me when I lose internet.
When you lose internet, you get a ! next to the bars (at least I have on my last few androids). Usually I also have no bars when I lose internet, but sometimes I've got coverage without data flow.
That is literally what I am observing lately with my provider: I have 2 bars and yet I do not have internet, whereas my gf, using the same iPhone model with a different provider and also showing 2 bars, has perfect data connectivity.
I build apps at the moment. In addition to the phone's network indicators, you really should give your users visible, live feedback on whether your servers are reachable, because there are so many things that can break down in between. Programming your app offline-first is also good, unless it's critically important that the information is either live or absent. We allow offline access by using React Query and storing its caches in user storage.
> And you could clearly see that on every release, iPhones were shifting the all the breakpoints more and more into the left, rendering more bars with less signal strength.
One thing that might explain this is that advancements in antenna design, RF component selection (including the actual circuit board), and especially (digital) signal processing allow a baseband to get a useful signal out of signal strengths that would have been just noise for older technology.
In ham radio in particular, the progress is amazing. You can do FT8 worldwide (!) communication on less than 5 watts of power, that's absolutely insane.
"Tests carried out by research group PolicyTracker, and shared with BBC's Morning Live, found that nearly 40% of the time a phone displays the 5G symbol, it is actually using a 4G connection"
I worked for a mobile network company a few years ago, the vibe I got there was that 5G penetration was still years away and that none of the providers were anywhere near ready for it.
Interestingly, that company built a bridge of sorts allowing providers to get more life out of their older hardware and software, converting e.g. 5G signals to 4G and 4G to 3G (where a signal is, for example, a phone phoning home telling the provider it used a megabyte of data, or looking up the IP address when calling a phone number).
Also, where the 2/3/4G network signals were all their own protocols (RADIUS and Diameter), 5G is just HTTP. And where for the 3G/4G stuff they had to write their own code to handle the protocols, for the 5G stuff they just used the cURL library. That is, cURL powers 5G networks.
> You know, I don't recall ever seeing 1 bar of signal strength on a smartphone.
I do.
I'm from Germany, land of perpetual EDGEing. Highest total GDP in the EU but can't build a mobile network for the life of it.
Then again we somehow forgot how to run trains and build cars without cheating, so I guess it fits.
Want to see a single bar? Come visit, our carriers aren't on the list with that inflate flag enabled. I guess they didn't get the same memo as the car manufacturers ;D
The boneheaded decision to "privatize" rail by creating a state-owned corporation competing with itself, a network of regional corporations with extremely inconsistent funding, and separate corporations for various services (like literally maintaining the infrastructure) has definitely resulted in "too much DB". Although I sincerely doubt that attempting to actually open up those "markets" by introducing foreign corporations like National Express does anything other than cannibalize rail services even further.
I still can't get over the justification for abandoning the €9/month universal ticket experiment (and replacing it with a €49/month offering which has since been bumped to €58/month and will soon be raised to €63/month) officially being in part that "rail will be worse when more people use it" (the other mostly being "not enough people used it to demonstrate its value" and "people used the ticket for trips they otherwise wouldn't have been able to afford to make").
We should just nationalize it all properly and make it free at point of service. Let tourists use it for free too, obviously. Infrastructure exists so the economy can happen, its ROI is a functioning industry and society so stop trying to pretend we can reasonably measure its success in profit.
I feel you. We have stellar coverage pretty much everywhere in NL. Heck, I was recently in a work video meeting in the car, not a single drop. The route included part of this:
Interesting to see you have a different experience. I'm not sure I would call it stellar. On the train route between Den Haag and Amsterdam, one of the busiest routes in the country presumably, reception is constantly dropping out. I'd love to be able to work on the train, but it's completely impossible if you need a network connection for anything.
Perhaps the route being so busy is the cause of the connectivity issues, but it's still baffling to me how bad it is, given that the amount of mobile devices trying to connect must be very predictable.
+1 on the train, mobile internet in the train is really bad. I kinda get it because you're in a faraday cage, moving between cells quickly, and frequently far outside of inhabited areas but still.
I'm pretty sure the in-train internet also relies on mobile networks, so that's unreliable too. Plus any bandwidth is taken up by people scrolling through tiktok.
Germany is a developing country when it comes to broadband let alone wireless internet.
The short version is that the chancellor we had in the 1990s didn't like how the public broadcasting channels were talking about his failures and wanted to push the development of private broadcasters (who being beholden to financial interests rather than objective news coverage mostly spoke favorably of him) by prioritizing cable television over fiber. A surprising number of things came downstream from that pivotal decision, e.g. the completely braindead way we sold frequency bands (which resulted in some literally remaining unused because there were initially no requirements to actually do anything with them).
> Highest total GDP in the EU but can't build a mobile network for the life of it.
GDP per capita (or GDP per square metre) would be a more useful indication here. Otherwise you could throw a bunch of poor countries together, just for purposes of statistics, and expect a better mobile network?
GDP per square metre is probably the best metric, even though it's the more rarely used one. [1] has a neat map of Europe by GDP density.
However Germany is still very high in both GDP per capita and GDP per land area. Roughly on par with the UK, and far higher than France which has a much better mobile network
> GDP per square metre is probably the best metric
GDP per square metre only really works for countries with uniform population density. For example, by European standards, Spain is huge, and basically entirely empty outside of a handful of cities...
> GDP per square metre is probably the best metric, even though it's the more rarely used one. [1] has a neat map of Europe by GDP density.
Well, it would be the best metric, if your country was homogeneously populated.
If everyone lives in one big city and there's literally no one in the rest of the country, then I expect mobile reception (and every other service) to be pretty good for everyone, because they all stay in the big city.
> However Germany is still very high in both GDP per capita and GDP per land area. Roughly on par with the UK, and far higher than France which has a much better mobile network
Yes, France, Germany and UK are all equal enough in these measures (well within an order of magnitude) that the much bigger difference in mobile networks is most likely due to some other factors.
Luckily Germany is pretty homogeneously populated. Far more so than the UK (England is pretty even, but Scotland is far emptier) or France (1/5th live in the Paris metro area).
Thing is, mobile networks are national affairs. A bunch of small countries has a lot of small telcos. Germany has 3 (2? not sure with the mergers) large telcos.
There are some economies of scale that work best at the country level.
Even with the EU single market, mobile phone operations almost always follow country borders. You'll get a different set of providers in Germany than you'll get one km away on the other side of the Rhine in France. Even though some of them may have the same name or the same ultimate owner or both, and even though you can roam on the other side of the border, you'll have a contract with a different entity, and different people will build and maintain the networking equipment.
Conversely, in the US, the major carriers all have nationwide coverage.
I also do, I'm Australian. I regularly experience both congestion caused by tower over-subscription as well as traveling waaaay out into the country where there's no reception, even on the Telstra network that boasts better coverage than everyone else by a mile.
I rarely encounter outright congestion in Australia tbh, but then again I avoid watching videos on the train.. so that's probably indicative of something :D
Coverage is decent on Telstra, but if you're out of town reception is rarely any good, presumably because there's little to no incentive to improve it when there's no one around to need it.
Better coverage may be claimed. But as you know, Australia is a big place.
The few farmers I know have a rough idea of the on-the-ground cell coverage. They say things like "this side of the hill/town" usually. I've seen them deliberately walk to the other side of a silo to make a call.
I assume the coverage maps show assumed cell-tower-coverage-if-shit-is-not-in-the-way. No surprise radios are common.
A Diné (Navajo) slang word for "cellphone" is "bił nijoobałí" which means "the thing you spin around with". Coverage on rez is not great you see, and in some places is so marginal that whether you get a usable signal depends not just on position but orientation...
Tangent but this is a pretty interesting topic. I've heard people speculate that local politics deliberately prevents such infrastructure, waiting for some kind of kickbacks to make it worth their while. Others suggest that it happens because federal telecom subsidies aimed at improving rural connectivity don't apply, as a kind of retaliation for tribal sovereignty. Way off-grid, ok, maybe it's simply not worth it to corporate telecom, but whatever the cause coverage even in fairly populated areas around Kayenta/Monument Valley is also quite bad in a way that would be infrequent in comparable communities in say, nowhere Appalachia.
Many a suburban parent of smart-phone addicted children would romanticize the whole thing and actually be kind of jealous of a situation like that. Years back and on the other side of the world, tourists were very scandalized about more roads and towers around Annapurna in Nepal.. but of course the locals usually do not actually like to be cut off from the world.
More telecom is probably good despite the evils, but fuck commercial billboards in particular. Those are still creeping closer to the Grand Canyons and Yosemites, and they suck whether it's for multinationals like McDonalds, or for locally owned gas stations or hotels that put cash into tribal communities. Ban them all like Hawaii, and everyone will be astonished to learn that the world keeps turning..
Moving to Germany from countries where mobile networks function is traumatic. My welcome experience was a USB stick with faulty drivers, my balance zeroed immediately because of a not-yet-activated data package, then sipping expensive 1GB data packages over choppy connections. Of course that was all my fault. The only reliable thing was the monthly billing and the telecom's enforcement of the contract length. When I heard before arrival "there is no internet in the apartment but you can simply buy a USB stick", I subconsciously felt there would be problems. Fuck, I hate these memories so much. Fuck everything about it and everyone involved.
I work with cellular BDA-DAS[1] gear sometimes, and I don't recall the last time I looked at the signal strength display on my phone. It has probably been years.
For me: It either works, or it doesn't work. It is either fast-enough, or impossibly-slow. It's very binary, and the bar graph at the top never told me a damned thing about what I should expect.
[1]: Bi-Directional Amplifier, Distributed Antenna System. In theory, such constructs can make indoor cellular coverage quite good inside of buildings that previously had none. In reality it can be... complicated. And while the bar graph doesn't mean anything, I still need ways to see what's happening as I spend hours, days, or [sometimes!] weeks surveying and troubleshoot and stuff. The phone can report things like RSRP, RSRQ, and some other tasty bits instead of just a useless graph -- and from there, I can sometimes make a hand-waving guess as to what I may reasonably expect for performance.
But that stuff is normally pretty well hidden from view.
> It is either fast-enough, or impossibly-slow. It's very binary, and the bar graph at the top never told me a damned thing about what I should expect.
A few months ago, I was in a remote area at anchor on a sailboat, about 6.5 miles from the nearest highway through the swamp, with only a few farms and a handful of houses within that radius. With my phone up in the cockpit of the boat and tethered over WiFi to my laptop, I was able to download a movie. As the boat swung on anchor, the download was occasionally interrupted, but when data was flowing it was consistently 5-10 MB/s over a claimed 5G link; the movie downloaded in much less time than its runtime. I assume I wasn't competing with much other traffic on that tower, wherever it was. So my experience was even more binary than yours.
The phone's signal indicator did seem to accurately indicate when it had no usable signal at all, but beyond that I'm not sure it was providing any useful information. And I'm not sure if it could have told me anything of use other than "connected" or "not connected". The very marginal connection was still faster than I had any right to expect for those conditions.
I had my car break down in remote mountains, and that little image had me climbing trees until I eventually found a place I could make a call from. Once I had two bars they could hear me; before that, they weren't getting anything I was transmitting.
I had a very dangerous 1-bar the other day. You see, I was in the Canadian wilderness relying on iPhone text-over-satellite, which works well, but only when you have no signal. I needed to relay a message to the rest of my group when suddenly I found myself with one bar of 3G that was completely and totally inoperable. No messages were getting through. To make matters worse, since my phone thought it had a bar, it wouldn't activate satellite. I tried every setting and then spent 20 minutes hiding behind various rocks trying to get my one bar to go away, until I finally found a spot that let me satellite-text again.
Our house is in kind of a hollow despite being in a city, and I (and guests - all networks seem just as bad) get one bar basically all the time at home.
Phone calls are hit-and-miss without WiFi calling switched on.
Try going into a Home Depot. I don't think I've ever found one where aside from fairly near the front I've had other than 0 or 1 bars, across a variety of phones and carriers and in neighborhoods where the signal outside the store was strong.
The net is telling me this is because of the aisle after aisle of tall metal shelving and the building itself also has a lot of metal in the construction.
It is quite annoying when you are trying to use the Home Depot app to look up something.
I’ve always just blamed the extreme bloat of the web, and the lack of design for poor connections, for the lack of performance at 2 bars. HN usually works fine on it, but that’s about it for the sites I visit.
Android phones show 1 bar pretty reasonably and fairly. To illustrate: I have 1 bar on both my SIM modules right now, which translates to about -125 dBm on both. So the connection is up, but it is borderline low.
Consider yourself blessed, the one place in my neighborhood where I get one bar on LTE is the same place I once was repairing my car. Awful experience but the rest of the subdivision is fine.
Heh, my phone consistently reports 1 bar inside my apartment within a major metropolitan area. Indeed binary, because it works enough for the few times I actually take calls not on wifi.
I see it all the time driving through the country. Probably a dozen times just today driving along the American east coast. I agree that two bars is the bare minimum for any functionality though.
IIRC this really took off with the antennagate fiasco on the iphone 4. I was working for Verizon at the time and this was also the first one we were able to sell. I forget who it was that did it but I believe it was Apple in response to people "holding their phone wrong" so they bumped everything up a bar so you couldn't tell. There was a lot of competition at the time but also all the androids had better margins so they wanted us to sell those instead.
Heh, funny. I recently implemented a countdown for a teleprompting app and that's exactly what I ended up doing to make the countdown "feel right".
The countdown in question doesn't display fractions of a second so it would immediately switch from "5 seconds left" to "4 seconds left" which just doesn't feel right. Adding 0.5s solved the issue.
If you're counting up, round down. If you're counting down, round up. A human expects the count to finish at precisely the moment we get to the last number in the sequence (zero, for counting down). Do a count in your head to see what I mean.
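The round-up rule for countdowns amounts to taking the ceiling of the remaining time rather than truncating it; a minimal sketch:

```java
// Sketch: display value for a countdown. Rounding up means the display
// reads "5" for the whole (4, 5] interval and reaches "0" exactly when
// the countdown actually ends, matching human expectation.
public class CountdownSketch {
    static long displaySeconds(double remainingSeconds) {
        return (long) Math.ceil(remainingSeconds);
    }

    public static void main(String[] args) {
        System.out.println(displaySeconds(4.999)); // 5
        System.out.println(displaySeconds(4.001)); // 5
        System.out.println(displaySeconds(0.0));   // 0
    }
}
```

Truncation instead would show "4" almost immediately after the count starts from 5, which is the "doesn't feel right" effect described above.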
Apple chose a compromise by rounding to nearest, for it to "feel good", but you lose the ability to exactly predict when the timer ends as a human. Typical Apple.
From looking at the bottom of the linked post (which says it was edited, not sure when in relation to your comment), it sounds like they wanted something that worked across arbitrary times split across units (hour/minute/seconds) without having to handle carry-over. I'm not sure I would choose to alter the times themselves over making the math a bit more complex, but the author has obviously thought about this a lot more than me, and it's nice that they at least considered that alternative.
Is there any reason to believe this mechanism is actually there to help carriers deceive users? To me this looks like it's intended to address some other issue, like perhaps "I have zero bars shown, what do you mean I'm still connected? That clearly means I'm disconnected..." I feel like anything intended to lie to users would not be implemented in this manner.
Think about all the various policies, dishonesty, PR spin, marketing, price-gouging, hidden fees, elimination of lifetime programs, and yes, outright fraud you've become aware of over the years. Just sit quietly for a moment, let those ideas stew. And if after one minute of silence you still feel the need to bestow upon these companies the most generous interpretation of their conduct possible, well, I'll be slightly surprised but I suppose that would conclude our conversation.
I think you missed important nuance in my comment. Note in my comment I was asking about this mechanism. I'm not suggesting the companies involved wouldn't deceive users. I merely question whether this is how they would do it. It feels way too simple, coarse, obvious/public, and inflexible if the goal is to trick users.
With Three UK, I gathered evidence over the course of 4 months to wiggle myself out of a £46/month, 28-month 5G contract (I had to pay the £200 remaining on my iPhone 16 Pro) by demonstrating that my phone was basically useless whenever in the postcode area where I live, even if I always had 1 bar of 5G signal.
Not even phone calls would go through, let alone calls on Whatsapp et al, or loading websites using something heavier than just text.
I raised a _formal_ complaint (they must report those to Ofcom), and after that it was just a matter of logging enough lost phone calls to demonstrate how many ended up in my answering machine.
The fact that WiFi calling is also super buggy and almost never works also played a big role.
My problem is, all other mobile providers in my area are even worse, showing LTE or 4G. So I just need to wait for them to strengthen signal, or move!
I'm a former Three user in central London. When I started it was good, then they advertised cheap unlimited data contracts which overloaded their system and they became close to unusable. You'd order an Uber, go down to meet it and be stuck because there was zero data. It wasn't a signal strength thing - it was a system overload thing.
I'm now on O2 which works kind of normally and also have a silent link esim which is a good backup. They cost like £8, never expire and let you use any UK network you choose if one isn't working. Or almost any network globally for that matter.
WiFi calling is one of the most improperly implemented features by carriers. Some just straight up deny WiFi calling if you're in airplane mode but connected to WiFi.
If you use an app such as Network Survey (open source, Android), you can see that providers also lie about the type of connection. I'm on LTE now, but the provider makes the phone display 5G.
And this is on a non-provider phone; it's built into whatever communication they do with the phone, and possibly works with every device.
Technically not lying if this is NSA (non-standalone) 5G. The 5G band just comes on as an additional aggregated band. The icon just shows up because the tower is capable of supplying the band.
Really, the bigger problem is that there's not enough distinction between SA and NSA.
Until about March this year, it was excellent and I used it as my home broadband. 60MB/s down, 20MB/s up on a good day. Much better than any ADSL I'm able to get.
Since March, from about 10:30am until 5pm some days, and late evening other days, there is no working data, and occasionally no working voice, despite the 5 bars.
It's working fine until then, and then it just stops completely, fading over the course of maybe 10 minutes. This happens all 7 days of the week.
The working theory is congestion at the base station. That's consistent with the occasional 6 minute ping times that I've measured, and more usual 20-30 second ping times, when anything gets through at all.
Still shows 5 bars. Three's coverage map says it's good here. Just can't use it.
I would rather see a live speed test number, emoji, or something. The signal frequency or strength doesn’t matter if the tower equipment is overloaded with users and running at dialup speeds.
I’ve been in bad tower areas where the solution is to drive to the next town or tower along the highway.
Maybe because the signal strength might not work as users expect?
Signal strength is like the loudness of music being heard. It's possible for music to be quiet but otherwise excellent, or loud but low-quality. However, if it is too quiet, the "music" becomes almost unintelligible, which the offset bars should still be able to indicate.
In Wi-Fi, 6GHz and 5GHz are often used instead of 2.4GHz. 2.4GHz would likely win in signal strength. Yet the others are used anyway, because they are better in other respects. However, if range (or compatibility) is critical, then 2.4GHz is used.
Similarly, in cellular, there is a lower frequency e.g. band 8/12/14/17/20/28/71 and a higher frequency e.g. band 1/3/7/30/38/40/41/66/77/78. (Less basically, it can be more granular.)
So this sequence of events is possible: Tower switches the phone to a higher frequency -> speed increases but the signal strength reduces (confusing, but at least doesn't seem bad if there are 3 or 4 bars.) A switch to a lower frequency normally occurs instead if the high frequency signal is weak.
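The strength difference between bands follows directly from free-space path loss, which grows with frequency. A quick sketch using the standard FSPL formula, 20·log10(d_km) + 20·log10(f_MHz) + 32.44; the band numbers and the 2 km distance are just illustrative:

```java
// Sketch: free-space path loss (FSPL) in dB for distance d (km) and
// frequency f (MHz). Same tower, same distance: the higher band loses more.
public class FsplSketch {
    static double fsplDb(double distanceKm, double freqMhz) {
        return 20 * Math.log10(distanceKm) + 20 * Math.log10(freqMhz) + 32.44;
    }

    public static void main(String[] args) {
        double low  = fsplDb(2.0, 900);   // e.g. band 8
        double high = fsplDb(2.0, 2600);  // e.g. band 7
        // The 2600 MHz band loses roughly 9 dB more over the same 2 km path,
        // so the same phone shows a noticeably weaker signal on it.
        System.out.printf("%.1f dB vs %.1f dB%n", low, high);
    }
}
```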
Cellular can be slow due to interference (maybe more common than signal strength issues; the metric to check instead might be SNR/SINR), congestion (also maybe more common than signal strength issues; the metric is less clear-cut, but the CFI value (if automatically changed), or RSRQ combined with a high SNR/SINR, might rule it out), the speed of the rest of the network (the metric might be RSRQ during a download with a high SNR/SINR), the data plan (again RSRQ during a download with a high SNR/SINR, or the QCI, which requires interpretation), and the channel width (the metric is BW). So it's confusing, and full bars are not always better.
For 2G, with each nearby cell (coverage area) basically getting its own channels, signal strength might've been more important, though interference was there somewhat (so there was MAIO planning etc.)
But aside from speed, there's the battery to consider. If the signal strength from the tower to the phone is "satisfactory", it's implied that the signal from the phone to the tower is too, so the phone won't have to use an elevated transmit power.
There is a logical and reasonable explanation. These companies are run by a bunch of sociopathic, unethical people who won't hesitate to lie and cheat if it gets them more money. It's as simple as that.
If my boss asks me to do something, and I refuse, what do you imagine will happen? I will get fired for insubordination. It's pretty easy to get people onboard if their paycheck depends on it.
You'd need to have this specific request from your supervisor on paper, with witnesses, all video recorded with the supervisor's consent. Otherwise you'd be sued into poverty for defamation.
In AirBnB and Booking.com apartments the camera is off when the owner says "don't worry, it's turned off". The camera is turned off even harder when English is not the main language of the country.
    private int getNumLevels() {
        if (mConfig.inflateSignalStrengths) {
            return SignalStrength.NUM_SIGNAL_STRENGTH_BINS + 1;
        }
        return SignalStrength.NUM_SIGNAL_STRENGTH_BINS;
    }
    ...
    } else if (mCurrentState.connected) {
        int level = mCurrentState.level;
        if (mConfig.inflateSignalStrengths) {
            level++;
        }
        return SignalDrawable.getState(level, getNumLevels(),
                mCurrentState.inetCondition == 0);
If the flag is true, bump up BOTH the reported level as well as the total number of bins.
If the flag is false, use reported level and default number of bins.
Since both numerator and denominator are bumped up, is it really malicious?
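You can check how much the displayed fraction actually moves: bumping both numerator and denominator is not a no-op at the low end. A quick sketch, assuming a bin count of 5 (an assumption on my part; the real value comes from `SignalStrength.NUM_SIGNAL_STRENGTH_BINS`):

```java
// Sketch: effect of the inflateSignalStrengths flag on the displayed
// fraction level/numLevels. The bin count of 5 is assumed, not confirmed.
public class InflateSketch {
    static final int BINS = 5;

    static double fraction(int level, boolean inflate) {
        int numLevels = inflate ? BINS + 1 : BINS;
        if (inflate) {
            level++;
        }
        return (double) level / numLevels;
    }

    public static void main(String[] args) {
        // At level 0 the jump is biggest: 0/5 = 0.0 becomes 1/6 ≈ 0.17,
        // i.e. a dead signal still renders something.
        System.out.println(fraction(0, false));
        System.out.println(fraction(0, true));
        // At the top the effect shrinks: 4/5 = 0.8 vs 5/6 ≈ 0.83.
        System.out.println(fraction(4, false));
        System.out.println(fraction(4, true));
    }
}
```

So the inflation is largest exactly where users are most likely to be misled: at the bottom of the scale.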
Based on this commit at least, personally, I feel such logic could be due to a decision to shift from levels starting from 0 to levels starting from 1 at the UI level.
Or perhaps to make levels consistent between different operators, some of whom were using 0-based while others used 1-based.
I haven't gone through later commits or latest versions. So my opinion's limited just to this original 2017 change.
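For concreteness, the effect of the flag can be restated as a quick Python sketch (my paraphrase of the arithmetic in the Java snippet above, not actual AOSP code): bumping both the level and the bin count still fills a larger fraction of the bar graph at every level below the maximum.

```python
# Sketch of the bar arithmetic from the AOSP snippet above (paraphrase, not the real code).
# With the flag set, both the reported level and the bin count are bumped by one.
def displayed_fraction(level: int, bins: int, inflate: bool) -> float:
    """Fraction of the bar graph that lights up for a given reported level."""
    if inflate:
        return (level + 1) / (bins + 1)
    return level / bins

# For every level below the maximum, (level+1)/(bins+1) > level/bins,
# so the inflated display is always at least as full:
for level in range(5):
    assert displayed_fraction(level, 4, True) >= displayed_fraction(level, 4, False)

# The biggest jump is at the bottom: zero signal still lights one bar of five.
print(displayed_fraction(0, 4, False))  # 0.0
print(displayed_fraction(0, 4, True))   # 0.2
```

So "same numerator and denominator bump" is not neutral: it shifts every non-maximal reading upward, most dramatically at the weak end.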
I'm surprised they even display the name and email of the person responsible for doing it. If I were forced to make such a change that I knew would be publicly displayed, I'd do everything possible to disclaim it (such as mentioning the one who actually requested it.)
That is a tricky one. I caught myself comparing bars to a friend’s phone before wondering if changing carriers would give me better signal in a certain area.
I frequently find that my data service is completely broken even when I have full 5G bars. Inflating by one is lame but doesn't explain this behavior. Is this a T-Mobile thing or is it widespread these days? I don't remember it happening so much 3+ years ago.
Signal strength is a measure of how proximate you are to the tower in terms of radio connectivity, but it says nothing about whether the tower will respond to you in a timely fashion, the tower's backhaul capacity, etc. Usually this happens because you have a great connection to the tower in theory, but in practice you can't get meaningful bandwidth, so everything appears broken. This is really common at sporting events and other large crowd gatherings, which is also why a lot of the promise of 5G was about using OFDMA to serve more customers adequately in the same physical space.
It's probably a reasonable pitch to say that phones should instead display something closer to "meaningful available bandwidth" crossed with strength, because a strong signal doesn't mean a good connection.
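A hypothetical indicator along those lines might blend strength with measured throughput. Everything in this sketch is invented for illustration: the function name, the normalization ranges, and the 20 Mb/s target are assumptions, not anything a real phone OS does.

```python
# Invented sketch: blend radio strength with a recent throughput measurement.
# None of these names or numbers come from any real phone OS.
def composite_bars(rsrp_dbm: float, recent_mbps: float, max_bars: int = 4) -> int:
    # Normalize RSRP roughly over a -120..-80 dBm range (assumed range).
    strength = min(max((rsrp_dbm + 120) / 40, 0.0), 1.0)
    # Normalize throughput against an arbitrary 20 Mb/s "good enough" target.
    usefulness = min(recent_mbps / 20.0, 1.0)
    # A strong signal with no usable bandwidth should score low, so multiply.
    return round(strength * usefulness * max_bars)

print(composite_bars(-85, 30))   # strong signal, fast data: full bars
print(composite_bars(-85, 0.5))  # strong signal, congested tower: zero bars
```

The multiplication is the point: a great radio link to a saturated tower scores near zero, matching the "full 5G bars, no data" experience described above.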
Maybe related to 5G? There are a couple spots near me (in particular, a somewhat crowded open mall) where I have solid bars but zero connectivity. Dropping to 4G works in most cases.
If we cared about signal strength, we'd make it part of the regulated telecommunications sphere: you must back your meter with a measurement path that shows signal strength accurate to x dB, in some y units of z quality, measured at one of the A, B conforming labs.
Poking around the config files, AT&T and two other carriers (both of which are subsidiaries, from a quick Google) seem to display 3G connections as if they were 4G:
Android documents[0] this flag, which it doesn't appear to do for the `inflate_signal_strength_bool` field outside the source code, from what I can tell. It seems like there are a bunch of odd flags for controlling user-exposed visuals - another flag, `show_4g_for_lte_data_icon_bool`, is used by 96 carriers, for example.
I wonder if there's some odd telecom history behind these, or if these flags were intended for some kind of edge case. It seems like carriers have the option to arbitrarily override the thresholds used for determining signal strength[1], but only four carriers actually do. All of them customize only the `lte_rsrp_thresholds_int_array` field, and all opt to make things harder for themselves, reporting their network connection as lower strength than the default classification[2] would:
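The threshold mechanism itself is simple to sketch. The values below are the commonly cited AOSP defaults for `lte_rsrp_thresholds_int_array`; treat them as an assumption rather than a quotation, and the function is my illustration, not the actual Android code.

```python
# Hypothetical sketch of mapping an RSRP reading to a signal level via a
# carrier-overridable threshold array (default values assumed, not authoritative).
DEFAULT_LTE_RSRP_THRESHOLDS = (-128, -118, -108, -98)  # dBm, weakest to strongest

def rsrp_to_level(rsrp_dbm: float, thresholds=DEFAULT_LTE_RSRP_THRESHOLDS) -> int:
    """Count how many thresholds the reading meets: 0 (poor) .. 4 (great)."""
    return sum(rsrp_dbm >= t for t in thresholds)

# A carrier that raises the thresholds reports *lower* levels for the same RSRP,
# which is the direction the four carriers mentioned above opted into.
stricter = (-115, -105, -95, -85)
print(rsrp_to_level(-100))            # 3 with the assumed defaults
print(rsrp_to_level(-100, stricter))  # 2 with the stricter override
```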
The same thing is done without modifying Android, likely by providers nearly everywhere in the world, though maybe not every provider. The provider sends "Network Override" configuration information and can make your phone display any network type. I can see this happening in the Network Survey app (open source) with my provider.
AT&T has a history of lying about what its network is. They were advertising HSPA+ as 4G and then recently started advertising LTE as "5G E". I can't find a lot of articles about the 4G branding one since the 5G one started.
> show_4g_for_lte_data_icon_bool
Realistically I think this is just a choice that many carriers made. It's quite common to see 4G instead of LTE outside of the US. Technically speaking I think WiMAX counted as 4G when there were competing 4G standards and you could make an argument that LTE is just one of the 4G standards.
Because it's about signal strength, but also the hostname is "nick vs networking". And if your site can't handle a few hundred requests per second (which I seriously doubt HN traffic comes anywhere close to) on a static web page, then you're doing something wrong.
I assume the irony is that an article about signal strength, which is closely related to data speed, is loading slowly because the site servers have slow data speed.
The signal strength measurement is actually standardised: https://en.wikipedia.org/wiki/Mobile_phone_signal#ASU
You're making huge and incorrect assumptions here, no? This also happens when your phone is entirely idle... and it randomly changes if you sit still for some time...
Android is quite lazy about searching for towers.
This suggests that the issue is not related to Android.
No idea why, especially since I'm the one who installs ad blockers and such. Her phone is essentially stock.
In some generations, different Apple models have pretty different radios. Is there a difference in bands, or something else?
That might be the worst app I’ve used on my iPhone in a year. Better off vibe coding an app to give you signal strength.
I remember the PM working on this feature showing us their research on how iPhones rendered bars across different versions.
They had different spectrum ranges, one for each of maybe the last 3 iPhone versions at the time. And overlayed were lines that indicated the "breakpoints" where iPhones would show more bars.
And you could clearly see that with every release, iPhones shifted all the breakpoints further and further to the left, rendering more bars at lower signal strength.
We tried to implement something that matched the most recent iPhone version.
So, game-theoretic evil?
One thing explaining this might be that advancements in antenna design, RF component selection (including the actual circuit board), and especially (digital) signal processing allow a baseband to get a useful signal out of signal strengths that would have been just noise for older technology.
In ham radio in particular, the progress is amazing. You can do FT8 worldwide (!) communication on less than 5 watts of power, that's absolutely insane.
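One piece of the physics behind all this is easy to state: free-space path loss grows with both distance and frequency, which is part of why "each frequency has its own signal strength." A quick sketch using the standard FSPL formula (antenna gains and real-world obstructions ignored):

```python
import math

def fspl_db(distance_km: float, freq_mhz: float) -> float:
    """Free-space path loss in dB (distance in km, frequency in MHz)."""
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.44

print(round(fspl_db(1, 900), 1))   # ~91.5 dB at 1 km on 900 MHz
print(round(fspl_db(1, 1800), 1))  # ~97.5 dB at 1 km on 1800 MHz
```

Doubling the frequency costs about 6 dB over the same path, so a higher-frequency carrier can genuinely show a weaker signal from the very same tower.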
"Tests carried out by research group PolicyTracker, and shared with BBC's Morning Live, found that nearly 40% of the time a phone displays the 5G symbol, it is actually using a 4G connection"
Interestingly, that company built a bridge of sorts allowing providers to get more life out of their older hardware and software, converting e.g. 5G signals to 4G and 4G to 3G (where a signal is, for example, a phone phoning home telling the provider it used a megabyte of data, or looking up the IP address when calling a phone number).
Also, where the 2/3/4G network signalling used its own protocols (RADIUS and DIAMETER), 5G is just HTTP. And where for the 3G/4G stuff they had to write their own code to handle those protocols, for the 5G stuff they could just use the cURL library. That is, cURL powers 5G networks.
Human brains: wow, what a bunch of suckers. Damn.
By the way, is it legal to be deceptive in this way?
I do.
I'm from Germany, land of perpetual EDGEing. Highest total GDP in the EU but can't build a mobile network for the life of it.
Then again we somehow forgot how to run trains and build cars without cheating, so I guess it fits.
Want to see a single bar? Come visit, our carriers aren't on the list with that inflate flag enabled. I guess they didn't get the same memo as the car manufacturers ;D
> Then again we somehow forgot how to run trains
The mobile networks don't have enough dB and the trains have too much DB?
I still can't get over the justification for abandoning the €9/month universal ticket experiment (and replacing it with a €49/month offering which has since been bumped to €58/month and will soon be raised to €63/month) officially being in part that "rail will be worse when more people use it" (the other mostly being "not enough people used it to demonstrate its value" and "people used the ticket for trips they otherwise wouldn't have been able to afford to make").
We should just nationalize it all properly and make it free at point of service. Let tourists use it for free too, obviously. Infrastructure exists so the economy can happen, its ROI is a functioning industry and society so stop trying to pretend we can reasonably measure its success in profit.
https://en.wikipedia.org/wiki/Afsluitdijk
Yet, when we visit family in Germany, five minutes after crossing the border we are in a cellular dead zone.
Perhaps the route being so busy is the cause of the connectivity issues, but it's still baffling to me how bad it is, given that the amount of mobile devices trying to connect must be very predictable.
I'm pretty sure the in-train internet also relies on mobile networks, so that's unreliable too. Plus any bandwidth is taken up by people scrolling through tiktok.
The short version is that the chancellor we had in the 1990s didn't like how the public broadcasting channels were talking about his failures and wanted to push the development of private broadcasters (who being beholden to financial interests rather than objective news coverage mostly spoke favorably of him) by prioritizing cable television over fiber. A surprising number of things came downstream from that pivotal decision, e.g. the completely braindead way we sold frequency bands (which resulted in some literally remaining unused because there were initially no requirements to actually do anything with them).
GDP per capita (or GDP per square metre) would be a more useful indication here. Otherwise, you could throw a bunch of poor countries together, just for purposes of statistics, and expect a better mobile network?
However Germany is still very high in both GDP per capita and GDP per land area. Roughly on par with the UK, and far higher than France which has a much better mobile network
1: https://ssz.fr/gdp/
GDP per square metre only really works for countries with uniform population density. For example, by European standards, Spain is huge, and basically entirely empty outside of a handful of cities...
Well, it would be the best metric, if your country was homogeneously populated.
If everyone lives in one big city and there's literally no one in the rest of the country, then I expect mobile reception (and every other service) to be pretty good for everyone, because they all stay in the big city.
> However Germany is still very high in both GDP per capita and GDP per land area. Roughly on par with the UK, and far higher than France which has a much better mobile network
Yes, France, Germany and UK are all equal enough in these measures (well within an order of magnitude) that the much bigger difference in mobile networks is most likely due to some other factors.
Even with the EU single market, mobile phone operations almost always follow country borders. You'll get a different set of providers in Germany than you'll get one km away on the other side of the Rhine in France. Even though some of them may have the same name or the same ultimate owner or both, and even though you can roam on the other side of the border, you'll have a contract with a different entity, and different people will build and maintain the networking equipment.
Conversely, in the US, the major carriers all have nationwide coverage.
Coverage is decent on Telstra, but if you're out of town reception is rarely any good, presumably because there's little to no incentive to improve it when there's no one around to need it.
The few farmers I know have a rough idea of the on-the-ground cell coverage. They say things like "this side of the hill/town" usually. I've seen them deliberately walk to the other side of a silo to make a call.
I assume that the coverage maps are assumed cell-tower-coverage-if-shit-is-not-in-the-way. No surprise radios are common.
Tangent but this is a pretty interesting topic. I've heard people speculate that local politics deliberately prevents such infrastructure, waiting for some kind of kickbacks to make it worth their while. Others suggest that it happens because federal telecom subsidies aimed at improving rural connectivity don't apply, as a kind of retaliation for tribal sovereignty. Way off-grid, ok, maybe it's simply not worth it to corporate telecom, but whatever the cause coverage even in fairly populated areas around Kayenta/Monument Valley is also quite bad in a way that would be infrequent in comparable communities in say, nowhere Appalachia.
Many a suburban parent of smart-phone addicted children would romanticize the whole thing and actually be kind of jealous of a situation like that. Years back and on the other side of the world, tourists were very scandalized about more roads and towers around Annapurna in Nepal.. but of course the locals usually do not actually like to be cut off from the world.
More telecom is probably good despite the evils, but fuck commercial billboards in particular. Those are still creeping closer to the Grand Canyons and Yosemites, and they suck whether it's for multinationals like McDonalds, or for locally owned gas stations or hotels that put cash into tribal communities. Ban them all like Hawaii, and everyone will be astonished to learn that the world keeps turning..
I work with cellular BDA-DAS[1] gear sometimes, and I don't recall the last time I looked at the signal strength display on my phone. It has probably been years.
For me: It either works, or it doesn't work. It is either fast-enough, or impossibly-slow. It's very binary, and the bar graph at the top never told me a damned thing about what I should expect.
[1]: Bi-Directional Amplifier, Distributed Antenna System. In theory, such constructs can make indoor cellular coverage quite good inside of buildings that previously had none. In reality it can be... complicated. And while the bar graph doesn't mean anything, I still need ways to see what's happening as I spend hours, days, or [sometimes!] weeks surveying and troubleshooting and stuff. The phone can report things like RSRP, RSRQ, and some other tasty bits instead of just a useless graph -- and from there, I can sometimes make a hand-waving guess as to what I may reasonably expect for performance.
But that stuff is normally pretty well hidden from view.
A few months ago, I was in a remote area at anchor on a sailboat, about 6.5 miles from the nearest highway through the swamp, with only a few farms and a handful of houses within that radius. With my phone up in the cockpit of the boat and tethered over WiFi to my laptop, I was able to download a movie. As the boat swung on anchor, the download was occasionally interrupted, but when data was flowing it was consistently 5-10 MB/s over a claimed 5G link; the movie downloaded in much less time than its runtime. I assume I wasn't competing with much other traffic on that tower, wherever it was. So my experience was even more binary than yours.
The phone's signal indicator did seem to accurately indicate when it had no usable signal at all, but beyond that I'm not sure it was providing any useful information. And I'm not sure if it could have told me anything of use other than "connected" or "not connected". The very marginal connection was still faster than I had any right to expect for those conditions.
Phone calls are hit-and-miss without WiFi calling switched on.
The net is telling me this is because of the aisle after aisle of tall metal shelving and the building itself also has a lot of metal in the construction.
It is quite annoying when you are trying to use the Home Depot app to look up something.
They finally added WiFi a year ago or so.
I hated having to walk near the doors to send a “was it this” question to the wife.
But one bar is death for Internet - though HN will often load; anything heavier won’t.
Wifi-calling to the rescue :)
That's not something I was expecting to hear
But then of course if you can push a customer one way or the other it will be to the higher margin product.
(Probably a way to do it on Android, too)
A CSR showed me this while debugging network connectivity issues with my phone.
Like what Apple does with stopwatch.
https://lukashermann.dev/writing/why-the-iphone-timer-displa...
The countdown in question doesn't display fractions of a second so it would immediately switch from "5 seconds left" to "4 seconds left" which just doesn't feel right. Adding 0.5s solved the issue.
If you're counting up, round down. If you're counting down, round up. A human expects the count to finish at precisely the moment we get to the last number in the sequence (zero, for counting down). Do a count in your head to see what I mean.
Apple chose a compromise by rounding to nearest, for it to "feel good", but you lose the ability to exactly predict when the timer ends as a human. Typical Apple.
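The rounding strategies in this subthread are easy to compare side by side. A sketch, where "nearest" models the add-0.5-then-truncate behavior described above:

```python
import math

def shown_seconds(remaining: float, strategy: str) -> int:
    """What a countdown would display for a given true remaining time."""
    if strategy == "ceil":      # round up: hits 0 exactly when time runs out
        return math.ceil(remaining)
    if strategy == "nearest":   # add 0.5 then truncate, as described above
        return math.floor(remaining + 0.5)
    return math.floor(remaining)  # naive truncation

# Just after starting a 5-second timer (4.99 s actually left):
print(shown_seconds(4.99, "floor"))    # 4: jumps straight to "4", feels wrong
print(shown_seconds(4.99, "ceil"))     # 5
print(shown_seconds(4.99, "nearest"))  # 5

# But with 0.4 s left, "nearest" already shows 0 while time remains:
print(shown_seconds(0.4, "ceil"))      # 1
print(shown_seconds(0.4, "nearest"))   # 0
```

The last two lines are the trade-off: "nearest" feels right at the start but reaches 0 half a second early, which is exactly the loss of predictability complained about above.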
This signal strength is straight up lying about the actual signal strength
Not even phone calls would go through, let alone calls on Whatsapp et al, or loading websites using something heavier than just text.
I raised a _formal_ complaint (they must report those to Ofcom), and after that it was just a matter of ensuring I lost enough phone calls to demonstrate how many ended up in my answering machine.
The fact that WiFi calling is also super buggy and almost never works also played a big role.
My problem is, all other mobile providers in my area are even worse, showing LTE or 4G. So I just need to wait for them to strengthen signal, or move!
I'm now on O2 which works kind of normally and also have a silent link esim which is a good backup. They cost like £8, never expire and let you use any UK network you choose if one isn't working. Or almost any network globally for that matter.
[0] https://en.wikipedia.org/wiki/Femtocell
And this is on a non-provider phone; it's built into whatever communication they do with the phone, and possibly works with every device.
Really the bigger problem is that there's not enough distinction between SA and NSA
Man, I love my HSPA+ “4G”!
Until about March this year, it was excellent and I used it as my home broadband. 60MB/s down, 20MB/s up on a good day. Much better than any ADSL I'm able to get.
Since March, from about 10:30am until 5pm some days, and late evening other days, there is no working data, and occasionally no working voice, despite the 5 bars.
It's working fine until then, and then it just stops completely, fading over the course of maybe 10 minutes. This happens all 7 days of the week.
The working theory is congestion at the base station. That's consistent with the occasional 6 minute ping times that I've measured, and more usual 20-30 second ping times, when anything gets through at all.
Still shows 5 bars. Three's coverage map says it's good here. Just can't use it.
I’ve been in bad tower areas where the solution is to drive to the next town or tower along the highway.
Signal strength is like the loudness of music being heard. It's possible for music to be quiet but otherwise excellent, or loud but low-quality. However, if it is too quiet, then the "music" becomes almost unintelligible, which even the offset bars should still be able to indicate.
In Wi-Fi, 6GHz and 5GHz are often used instead of 2.4GHz. 2.4GHz would likely win on signal strength, yet the others are used anyway because they're better in other respects. However, if range (or compatibility) is critical, then 2.4GHz is used.
Similarly, in cellular, there is a lower frequency e.g. band 8/12/14/17/20/28/71 and a higher frequency e.g. band 1/3/7/30/38/40/41/66/77/78. (Less basically, it can be more granular.)
So this sequence of events is possible: Tower switches the phone to a higher frequency -> speed increases but the signal strength reduces (confusing, but at least doesn't seem bad if there are 3 or 4 bars.) A switch to a lower frequency normally occurs instead if the high frequency signal is weak.
Cellular can be slow due to several things: interference (maybe more common than signal strength issues; the metric to use instead might be SNR/SINR); congestion (also maybe more common than signal strength issues; the right metric is less obvious, but the CFI value (if automatically changed), or RSRQ combined with a high SNR/SINR, might rule it out); the speed of the rest of the network (the metric might be RSRQ during a download with a high SNR/SINR); the data plan (the metric might be RSRQ during a download with a high SNR/SINR, or the QCI, which requires interpretation); and the channel width (the metric is BW). So it's confusing, and it's not exactly the case that full bars are always better.
Here kitty kitty ...
[1]: https://android.googlesource.com/platform//frameworks/base/+...
2 bins out of 4 suggests 25%-50%
3 bins out of 5 suggests 40%-60%
Hm - what is the word 'INFLATE' doing there?
I would like to believe your opinion but that word INFLATE makes it hard ...
If it is a UI correction, then surely it would have had a different name: ENSURE_SIGNAL_IS_ONE_BASED ... ;)
(but I still would love to know how one hides from git blame)
(file renaming?)
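One way to read the "2 bins out of 4" vs "3 bins out of 5" comparison above is as interval arithmetic: what range of underlying signal quality a reader would infer from the displayed bars. A sketch of that reading (my interpretation, not anyone's actual code):

```python
def implied_range(bars_shown: int, total_bars: int) -> tuple:
    """Range of underlying signal fraction a reader might infer from the display."""
    return ((bars_shown - 1) / total_bars, bars_shown / total_bars)

print(implied_range(2, 4))  # (0.25, 0.5)
print(implied_range(3, 5))  # (0.4, 0.6): same signal, rosier implication
```

Both the floor and the ceiling of the inferred range go up, so even though the bumped display is "internally consistent," the impression it leaves is strictly more optimistic.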
It's showing full or near-full bars even in places where I can't load light sites like HN properly.
Psychology tricks like these only work if you don't overdo them.
[0]: https://developer.android.com/reference/android/telephony/Ca...
[1]: https://source.android.com/docs/core/connect/signal-strength...
[2]: https://android.googlesource.com/platform/frameworks/base/+/...
https://www.theverge.com/2011/05/04/536673/att-t-mobile-dipp...
Is the commit that added it.
Wayback machine link: https://web.archive.org/web/20251103013626/https://nickvsnet...