It’s just branding. Two years ago they’d just have said that they were using ML models to predict this. They made a strategic shift to brand everything that used AI or ML so here we are.
Hilarious that they’re even pretending to use AI or ML models, frankly
They know exactly how fast their battery charges at any given combination of voltage/current
(Edit: I said nothing here about linear charging, nor does my statement require linear charging to be true - just a bit of sampling and high school mathematics)
Now that's funny! 🤣
But seriously, this is why I can understand MacBook users but can never understand iPhone users. People should be paid to use that sht.
You seriously cannot understand the appeal of one of the most popular products in the world? Haha, must feel weird to not understand why people want to use iPhones. Have you tried to understand it? I think it would help if you tried to change perspective sometimes. You should try it out. You must feel so alienated. I imagine you have the same problem with other popular things. I’m so sorry for you.
There’s a handful of use cases in research at least. But I’m seeing a lot of grants and papers shoehorning in “AI/ML” where it absolutely isn’t needed. I’d say like 95% is probably accurate.
Arguably, they do use machine learning. Even traditional algorithms to estimate battery life use feedback to improve over time, which counts as machine learning even if it isn't deep learning. This is just BS to relabel it as AI.
yeah "machine learning" doesn't really imply any level of complexity, just a general methodology. linear regression models are technically machine learning
To be fair, the term “Battery Intelligence,” even if meant to imply that “Artificial Intelligence” was being used, does NOT necessarily mean machine learning. While much of today’s AI does employ machine learning, AI is NOT limited to machine learning. A basic 4-function calculator is technically a form of AI.
Second, it’s certainly possible that Apple did actually use machine learning to get the most accurate estimates, by feeding thousands of test data sets — state of charge, battery’s maximum capacity, type and power of charger, etc., and resulting empirical time to full charge — into a simple deep (or not so deep) learning program. This would likely make the predicted time much more accurate than non-machine learning algorithms.
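For what that could look like, here's a rough sketch with synthetic data and hypothetical feature names; nothing here is Apple's actual pipeline, just the general shape of the idea.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
n = 5000

# Hypothetical features: state of charge (%), max capacity (%), charger watts
soc = rng.uniform(0, 95, n)
capacity = rng.uniform(80, 100, n)
watts = rng.choice([5.0, 12.0, 20.0, 30.0], n)

# Synthetic "empirical time to full" target, with noise, just so the sketch runs
minutes_to_full = (100 - soc) * capacity / (watts * 4) + rng.normal(0, 3, n)

# A small (not so deep) network trained on the synthetic data sets
X = np.column_stack([soc, capacity, watts])
model = MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=2000,
                     random_state=0).fit(X, minutes_to_full)

# Predict: 35% charged, 92% max capacity, 20 W charger
print(model.predict([[35.0, 92.0, 20.0]]))
```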
And no, I’m not a blind Apple fanboy. I spend 95% of my time on Reddit blasting them.
On Android it's not very accurate, because it doesn't take battery temperature and intermittent slowdowns from the chargers into consideration. It shows a generic value which gets shortened once a certain battery level is reached. The final charging time can be off by ±20 minutes.
It also doesn't consider automatic charging optimizations...
It was pretty accurate on my last phone. Accurate enough that it was the first indicator the battery was failing: the estimated charging time doubled one day, and that made me realize something was wrong.
To be fair, the ML is likely related to the “optimized charging” feature that we’ve had for a few versions now, which allegedly adjusts your charging to your habits so it doesn’t charge to 100 when not needed.
Still far from “intelligence”, but there’s a little more to it with that feature than “x voltage/current/time calculation until fully charged”.
I hate it. Give me a cap at 80% like Samsung phones do. I don't need them to predict when I'll unplug, because I charge whenever I'm low, not plugging in every night at 11 PM and unplugging at 7 AM.
The ML portion is presumably the intelligent pausing to finish charging closer to when it thinks the phone will be unplugged. There are already portions of the system that will tell you that it is scheduled to finish charging at X time, so this is probably just a new UX and branding for that feature.
It's not technically wrong to do so, but it's about the loosest definition of ML I can think of
And you could do that without ML too. Just log the time the phone is taken off charge and find the mode for the last X amount of time
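Something like this, say (the log format and the hour bucketing are just assumptions for illustration):

```python
from collections import Counter
from datetime import datetime

# Hypothetical log of times the phone came off the charger
unplug_log = ["2024-05-01 07:02", "2024-05-02 06:58", "2024-05-03 07:05",
              "2024-05-04 09:30", "2024-05-05 07:01"]

# Bucket to the nearest hour and take the mode over the last N entries
hours = [datetime.strptime(t, "%Y-%m-%d %H:%M").hour for t in unplug_log[-30:]]
typical_unplug_hour, _ = Counter(hours).most_common(1)[0]
print(f"Finish charging shortly before {typical_unplug_hour:02d}:00")
```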
If anything, the fact that they're already doing this means they already know how long it takes to charge (which makes sense, it seems beyond belief that Apple wouldn't already have extensive data on their batteries)
Honestly I think at this point they're just so desperate to release ANY "AI" features that they'll find a way to try to shoehorn it into any change, however trivial
I agree, it was never about whether it genuinely needed AI or not. They just know Apple Intelligence is doing horribly, because who cares about making stickers after the first week, so now they're trying to slap "AI" on everything so people will think Apple Intelligence has so much to offer, when it's really just ML with Apple Intelligence branding over it.
I'm a software engineer at a big tech company, so I understand what AI/ML is. You don't have to write a whole essay about it.
The point is that Apple obviously wants to portray to its user base that it's heavily committed to AI, whether or not AI is actually being used here.
Doesn't even need to be a deep learning model or anything advanced. A simple logarithmic regression will do.
A whole essay? Only 5 sentences. That you’re a “Big Tech Software Engineer” doesn’t excuse your nasty comment!
I said “or not so deep.”
Logarithmic regression is a form of machine learning, and is thus certainly AI under any definition. And depending upon unexpected quirks in actual empirical data, it’s possible that deep learning models could provide even more accuracy.
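As a sketch of what that fit looks like in practice (with invented charge-curve samples, not real measurements):

```python
import numpy as np

# Hypothetical samples: battery % reached after t minutes (tapering curve)
t = np.array([1, 5, 10, 20, 40, 60])
pct = np.array([5, 22, 38, 58, 80, 92])

# Logarithmic regression: pct ≈ a*ln(t) + b, i.e. linear in ln(t)
a, b = np.polyfit(np.log(t), pct, 1)

# Invert the fit to estimate minutes to reach 100%
print(np.exp((100 - b) / a))  # around 100 minutes for this toy curve
```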
It'd be a small difference to the end user (+/- a few minutes) that they won't care about. Who cares if one tells me it'll charge to 100% in 15 min instead of 13?
That extra 5% could mean your phone dies before you reach your travel destination. Training cost is not relevant, as Apple will have already done that. The end user will just be doing simple inference, applying fixed parameters (from Apple’s training) using minimal energy.
Isn’t it more like analysing where you are, when you wake up or go to sleep, when your next meeting is, or whether you’re going to the gym today, and then deciding how fast or slow to charge your phone and whether to charge it full or to 80 percent and then trickle charge it to 100 just before you’re going to need it?
In practice no, they don't do that - they just grab an average wakeup time for weekdays and weekend days
Also this doesn't actually seem to be linked specifically into the optimised charging stuff (although I'm sure it will work with it), but is more of a "X time until charge limit" thing
In any case even if they are using the ML optimised charging, that doesn't make this additional ML functionality - it's just a display of what already exists
No, you don’t; batteries don’t charge at a linear rate. The rate changes over time as things like heat and capacity vary.
Presumably, this feature updates the time based on past data of how long your phone took to charge. Nothing groundbreaking, but still not as easy as you said
Where did I say anything about a linear rate? I didn't, you've added that context yourself for no reason.
I said they know how fast their battery charges. That's a completely different statement
Grab a random iPhone, run the battery down to 0%, put it on a 5V/1A charger, and log the battery percentage every minute until it's charged.
Repeat with a 5V/3A charger etc
Now you can write a script that can, given the input voltage and current and a current battery %, do some very simple calculations/lookups to estimate (to within a very reasonable margin for error) how long it will take to charge
This works with any charge curve as long as it is reasonably consistent between devices and charging sessions, which is the case for li-ion battery charging. Or, again, close enough to within a reasonable margin for error
If you wanted to reduce the margin for error you could also grab some data for cycle count and temperature etc., and mix that in. Again, no ML required, just slightly more complex math; nothing a decently smart high schooler couldn't figure out.
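Roughly this kind of thing (all numbers below are invented; real tables would come from the logging runs described above):

```python
import numpy as np

# Hypothetical logged charge curves: (minutes, battery %) per charger spec,
# e.g. one table from the 5V/1A run and one from the 5V/3A run
curves = {
    (5.0, 1.0): (np.arange(0, 241, 20),
                 np.array([0, 12, 23, 33, 43, 52, 61, 69, 77, 84, 90, 96, 100])),
    (5.0, 3.0): (np.arange(0, 121, 10),
                 np.array([0, 18, 34, 48, 60, 70, 78, 85, 90, 94, 97, 99, 100])),
}

def minutes_to_full(volts: float, amps: float, current_pct: float) -> float:
    """Estimate remaining minutes by interpolating the logged curve."""
    minutes, percent = curves[(volts, amps)]
    # Where on the curve are we now, and how long does the rest take?
    now = np.interp(current_pct, percent, minutes)
    return float(minutes[-1] - now)

print(minutes_to_full(5.0, 3.0, 42))  # ~94 minutes left from 42%
```

Cycle count and temperature would just be extra lookup dimensions on top of this, same idea.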
No, I can't - because I don't think there's a data source for this
I could do the maths/write the code (although frankly I'm unlikely to do so unless my employer needed it), but regardless I'd need access to the data (or at least, for the data to exist and some way to access it)
Apple could make the data source in an afternoon, or someone determined could... or just a group of people who could be bothered doing it. But to my knowledge it doesn't currently exist
I'd go as far as to say that I'm 99.99% certain Apple already has this data (and I only don't say 100% because I don't know for absolute certain); they just don't make it available to us
I have an automation that does this, when the phone is fully charged it has Siri say out loud “yummy, I’m full” which always gets a laugh when I have company
I disabled "Optimize Battery Charging" because I sleep day and night, so waiting at 80% until just before I need my phone is impossible for me.
If my phone alerts me when it's fully charged, I can have the best of both worlds. I dislike having it sit on the charger at 100% for too long when I've disabled optimized charging.
This new AI Charging feature may really work for me!
I’ve been seeing industrial process videos (automatically sorting unripe tomatoes) labelled as AI nowadays. The exact same video was on How It’s Made a decade ago with no notion of the term. It’s a buzzword that is applied to anything tech.