Specifically, how quickly does total life capacity degrade with deeper cycling?
Assumptions: (1) Phone use is not an “extended mission” - I can recharge several times daily without inconvenience. (This is even truer with inductive chargers!) And (2) modern batteries have little memory effect, so shallow cycling will not compromise a full charge/deep discharge when one is needed.
Most sites/posts assume a linear correlation - e.g. half the depth of discharge allows twice the number of cycles. What I (and many others) would like to know is how the total lifetime charge that can be extracted from a battery (say, before it drops to 50% of its rated capacity) varies with the depth of charge/discharge; more specifically, which high and low charge points give near-optimal lifetime with reasonable convenience - e.g. 80% of the best possible total charge with twice-daily partial charging.
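To make the question concrete, here is a minimal sketch of the two competing models. It assumes a power-law cycle-life relationship, cycles ≈ k / DoD^p, where the constants k and p are purely hypothetical placeholders, not measured values for any real cell chemistry. With p = 1 you recover the linear “half the depth, twice the cycles” rule (total lifetime throughput is constant regardless of cycling depth); with p > 1, shallower cycling delivers more total charge over the battery’s life - which is exactly the distinction I am asking about.

```python
# Illustrative model only: k and p are hypothetical constants,
# not data for any specific battery.

def cycles_to_eol(dod, k=500.0, p=1.3):
    """Cycles until end-of-life for a given depth of discharge (0-1).

    Assumes the empirical power-law form cycles ~ k / dod**p.
    p == 1 reproduces the linear rule of thumb; p > 1 favors
    shallow cycling.
    """
    return k / dod ** p

def lifetime_throughput(dod, capacity_mah=3000.0, k=500.0, p=1.3):
    """Total charge (mAh) deliverable over the battery's life:
    cycles multiplied by the charge extracted per cycle."""
    return cycles_to_eol(dod, k, p) * dod * capacity_mah

for dod in (0.2, 0.5, 0.8, 1.0):
    print(f"DoD {dod:.0%}: {lifetime_throughput(dod) / 1000:.0f} Ah total")
```

If the linear rule were exact (p = 1), the loop would print the same total for every depth; what I am after is real data on whether, and by how much, p exceeds 1 for phone batteries.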
An EE friend in the industry told me, back in the early days of hybrid autos, that the Prius etc. optimized both battery life and energy efficiency by targeting charge states between 50% and 85%. This allowed headroom for regenerative braking while keeping enough bottom capacity to start the petrol engine and drive in traffic a bit without it. I have no idea whether this was accurate then or applies now; battery technology has evolved, and normal cell phone use does not need to accommodate high charge/discharge currents.
So - can anyone please provide info, or a reference, on the optimal charge/discharge range that supports perhaps a half-day’s moderate-intensity use (talking rather than streaming, near cell towers, normal temperatures, etc.) while maximizing battery life?