Tag Archive for Hardware

What is Quantum Computing?

The world of theoretical physics has long been the domain of geniuses like Stephen Hawking and fictional characters such as The Big Bang Theory’s Sheldon Cooper. But now companies like Google (GOOG), IBM (IBM), and Intel (INTC) are building quantum computer systems that may soon outperform even the fastest supercomputers in the world. So, it’s a good time to learn some basic quantum computing terms and concepts.

It’s physics

Quantum computing is based on quantum physics, the arm of modern physics that explains the nature and behavior of matter and energy at the atomic and subatomic levels. It is also called quantum theory and quantum mechanics. Quantum computers use quantum physics to compute.

Before quantum physics, “classical” physics explained the world around us (calculations of speeds, rotations, weights, forces, and so on). Then came Einstein, who explained the “infinitely large”: the universe, time, the big bang, black holes. But classical mechanics did not explain everything, and this is where quantum physics steps in. The world of atoms, the infinitely small, does not work like the world that we humans see every day. The algebra story problems about a ball bouncing off a wall at 37 degrees and landing 43 feet away no longer apply in the world of quantum physics. Quantum computing devices use these newly discovered properties to perform computations using quantum bits, or qubits.

Classical computers

Pierre Pinna at IPFCOnline explains that the “classical” computer sitting on your desk manipulates information (software, texts, pictures, videos, etc.). Inside your laptop, this information is made up of 1’s and 0’s. All computers have one (or more) microprocessors that manipulate the 0’s and 1’s, applying basic operations (addition, subtraction, multiplication) to “order” the 1’s and 0’s into software, texts, pictures, videos, etc.

The 1’s and 0’s are physically created by electric current inside transistors. Each transistor can be on or off, which indicates the 1 or 0 to be used to compute the next step in a program.

When the transistor is open, the electric current does not pass through it and we say that we are in state “0”; conversely, if the transistor is closed, the electrical current can pass through it and we are in state “1”. The transistors inside the CPU can be combined into logic gates to perform logic operations like “OR,” “XOR,” and “AND.” The classical computer’s 1’s and 0’s are called “bits.”
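The gate logic above can be sketched in a few lines of Python. The half adder at the end is my own illustration (not from the article); it shows how gates combine into the arithmetic a CPU performs:

```python
# Classical bits are 0 or 1; gates combine them.
def AND(a, b):  # 1 only when both inputs are 1
    return a & b

def OR(a, b):   # 1 when at least one input is 1
    return a | b

def XOR(a, b):  # 1 when exactly one input is 1
    return a ^ b

# A half adder, built from XOR and AND, adds two bits:
def half_adder(a, b):
    return XOR(a, b), AND(a, b)  # (sum bit, carry bit)

print(half_adder(1, 1))  # (0, 1): 1 + 1 = binary 10
```

Everything a classical processor does, no matter how elaborate, reduces to networks of gates like these.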

Quantum computers

Quantum computers also handle 1’s and 0’s, just like your laptop. But the information is no longer manipulated by transistors; it is manipulated by atomic and subatomic particles (electrons, protons, ions, photons, neutrons, etc.). You know, the stuff they taught in Mr. Birchmeier’s high school science class. Quantum computers don’t use bits; they use quantum bits (qubits). And that’s where quantum computing gets interesting: the subatomic world does not work like the physical world we live in. Quantum physics explains how the subatomic world works.

Tristan Greene at TNW writes that qubits have extra functions that bits don’t. Instead of only being represented as a 1 or 0, qubits can actually be both at the same time. Mr. Greene writes that qubits, when unobserved, are considered to be “spinning.” Instead of referring to these types of “spin qubits” using ones or zeros, they’re measured in states of “up,” “down,” and “both.”

This lab at IBM houses quantum machines connected to the cloud.

The IPFCOnline article explains that to better understand all of this, we must see each particle as a wave and not a single physical element. The particles are then characterized by their “spin” to create a state called superposition.

Mr. Greene at TNW writes that quantum superposition in qubits can be explained by flipping a coin. We know that the coin will land in one of two states: heads or tails. This is how classical computers think. While the coin is still spinning in the air, the coin is actually in both states at the same time. Essentially until the coin lands, it has to be considered both heads and tails simultaneously.

Quantum computing uses superposition

Superposition is based on observation theory, which basically says the universe acts one way when we’re looking and another way when we aren’t. Mr. Pinna at IPFCOnline writes that with superposition, while we do not know what the state of an object is, it is actually in all possible states simultaneously, as long as we don’t look to check. To illustrate this theory, we can use the famous and somewhat cruel analogy of Schrödinger’s cat, in which a cat in a box is both alive and dead at the same time.
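The superposition and observation ideas above can be captured in a toy simulation — a minimal sketch, not how real quantum computers are programmed. A qubit is held as two amplitudes, and “looking” (measuring) forces it into a definite 0 or 1:

```python
import random

# A single qubit as two amplitudes (alpha, beta), with
# |alpha|^2 + |beta|^2 = 1. "Both at once" means both are nonzero.
alpha, beta = 2**-0.5, 2**-0.5   # equal superposition, like the spinning coin

p_zero = abs(alpha)**2           # probability of observing 0 ("heads")
p_one  = abs(beta)**2            # probability of observing 1 ("tails")

def measure():
    # Observation collapses the qubit into one definite classical state.
    return 0 if random.random() < p_zero else 1

counts = [measure() for _ in range(10_000)]
print(p_zero, p_one)             # both are ~0.5 for the equal superposition
```

Before `measure()` is called, the qubit carries both possibilities at once; afterward, only a plain classical bit remains. That collapse is exactly the coin landing.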

All of this subatomic activity makes the quantum computer very sensitive to disturbances from the outside world. When quantum computers are disturbed they become unstable and revert to “classical computers.” In order to keep the quantum properties of the system, it must be protected from the outside world. According to the article, this is typically done by cooling the quantum computer to temperatures very close to absolute zero (-273.15°C, colder than space). Another factor when working with qubits is noise. The more qubits a system has, the more errors you get.

All of these factors make working with qubits incredibly difficult. These challenges are made worse by the unsustainable amount of electricity currently needed to generate quantum computing results. Reports are that one quantum computer burns about 20 megawatts of electricity — enough to power 20,000 households.

Therefore, the current state-of-the-art quantum computing theoretical speed gain is limited by the cost, size, and instability of the system. Right now, quantum computers aren’t worth the trouble and money they take to build and operate. A quantum computer is not going to run MS Word on your desktop.


Ralph Bach has been in IT long enough to know better and has blogged from his Bach Seat about IT, careers, and anything else that catches his attention since 2005. You can follow him on LinkedIn, Facebook, and Twitter. Email the Bach Seat here.

Undersea Data Center

Updated 08/09/2019 – Microsoft has installed two underwater cameras that offer live video feeds of the sunken data center. You can now watch all kinds of sea creatures swimming around a tank that holds 27.6 petabytes of data.

Followers of the Bach Seat know that Microsoft (MSFT) has experimented with undersea data centers to save on the costs of deploying data centers. Back in 2015, I wrote about MSFT’s initial experiment off the California coast, where MSFT first tried out the idea of an underwater data center. Redmond has announced phase 2 of Project Natick, designed to test the practical aspects of deploying a full-scale lights-out data center underwater, called “Northern Isles.”

Kurt Mackie wrote in an article at Redmond Magazine that Microsoft is testing this underwater data center off the coast of Scotland near the Orkney Islands in the North Sea. Microsoft wants to place data centers offshore because about half the world’s population lives within 125 miles of a coast. Locating data closer to its users reduces latency for bandwidth-intensive applications such as video streaming and gaming, as well as emerging artificial intelligence-powered apps. Latency is the time it takes data to travel from its source to customers. It is like the difference between using an application on your hard drive and using one over the network.
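To get a rough sense of why those 125 miles matter: light in optical fiber covers roughly 200 km per millisecond (an approximate rule of thumb, not a figure from the article), so round-trip latency grows directly with distance:

```python
# Rough rule of thumb: light in fiber travels at about 2/3 the speed of
# light in vacuum, roughly 200 km per millisecond one way.
FIBER_KM_PER_MS = 200.0

def round_trip_ms(distance_km):
    # Out and back, ignoring routing and equipment delays.
    return 2 * distance_km / FIBER_KM_PER_MS

print(round_trip_ms(200))   # 2.0 ms for a coastal user ~125 miles (200 km) away
print(round_trip_ms(4000))  # 40.0 ms from a data center 4,000 km inland
```

Real-world latency is higher once routers and indirect paths are added, but the ratio holds: parking the data center just offshore shaves tens of milliseconds for coastal users.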

Mr. Mackie posts that the original underwater data center had the computing power of 300 PCs; phase 2’s computing power is equal to “several thousand high-end consumer PCs,” according to Microsoft’s FAQ page. This next-generation underwater data center requires 240 kW of power, is 40 feet in length, and holds 12 racks with 864 servers. The submarine container is mounted on a metal platform on the seafloor, 117 feet deep. The phase 2 data center can house 27.6 petabytes of data. A fiber-optic cable keeps it connected to the outside world. Naval Group, a 400-year-old French company, built the submarine part of the project.
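A quick back-of-the-envelope pass over those phase 2 figures (the per-server numbers are derived, not stated in the article):

```python
# Phase 2 figures quoted above.
racks, servers, storage_pb, power_kw = 12, 864, 27.6, 240

print(servers // racks)                    # 72 servers per rack
print(round(storage_pb * 1000 / servers))  # ~32 TB of storage per server
print(round(power_kw * 1000 / servers))    # ~278 W per server
```

Those are dense but unremarkable server numbers; the novelty is the container they sit in, not the hardware.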

The interesting part (U.S. Navy submarines have had computers onboard for years) is the lights-out part. Lights-out operation allows Microsoft to change how data centers are deployed. Northern Isles’ cooling techniques are also different. The cold-aisle temperature is kept at a chilly 54F (12C) to remove the stress that temperature variations place on components. This temperature is maintained using a heat-exchange process developed for cooling submarines. Ben Cutler, Microsoft Research Project Natick lead, told Data Center Knowledge, “... by deploying in the water we benefit from ready access to cooling – reducing the requirement for energy for cooling by up to 95%.”

With phase 2, Mr. Cutler explained to DCK, there’s no external heat exchanger: “We’re pulling raw seawater in through the heat exchangers in the back of the rack and back out again.” This cooling system could cope with very high power densities, such as those required by GPU-packed servers used for heavy-duty high-performance computing and AI workloads.

According to DCK the first iteration of Project Natick had a Power Usage Effectiveness (PUE) rating of 1.07 (compared to 1.125 for Microsoft’s latest-generation data centers). The lower the PUE metric, the more efficiently the data center uses electricity. Microsoft hopes to improve the PUE for the phase 2 data center.
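PUE is simply total facility power divided by the power that reaches the IT equipment, so everything above 1.0 is overhead (cooling, power conversion, and so on). A sketch, pairing the article’s PUE ratings with its 240 kW figure purely for illustration — the 1.07 rating was measured on phase 1, not on this exact load:

```python
def overhead_kw(it_kw, pue):
    # PUE = total facility power / IT power, so the power spent on
    # cooling and conversion beyond the IT load itself is it_kw * (pue - 1).
    return it_kw * (pue - 1)

print(round(overhead_kw(240, 1.07), 1))   # 16.8 kW of overhead at PUE 1.07
print(round(overhead_kw(240, 1.125), 1))  # 30.0 kW at PUE 1.125 on land
```

Nearly halving the overhead per watt of IT load is the practical payoff of free seawater cooling.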

Data centers are believed to consume up to 3% of the world’s electricity. The new cooling options change the Northern Isles data center’s power requirements. It can run off the Orkney Islands’ local electrical grid, which is powered by renewable wind, solar, and tidal sources. One of the goals of the project is to test powering the data center with an off-the-grid source, such as nearby tidal power.

Future versions of the underwater data center could also have their own power generation. Mr. Cutler told DCK, “Tide is a reliable, predictable sort of a thing; we know when it’s going to happen … Imagine we have tidal energy, we have battery storage, so you can get a smooth roll across the full 24-hour cycle and the whole lunar cycle.”

This would allow Microsoft to do away with backup generators and rooms full of batteries. They could over-provision the tidal generation capacity to ensure reliability (13 tidal turbines instead of 10, for example). Mr. Cutler says, “You end up with a simpler system that’s purely renewable and has the smallest footprint possible.”

The Northern Isles underwater data center is designed to run without being staffed, which cuts down on human error. It is designed with a “fail-in-place” approach: failed components are not serviced, they are just left in place. Operations are monitored by artificial intelligence. Mr. Cutler said, “There’s a lot of data showing that when people fix things they’re also likely to cause some other problem.”

Operating in “lights out” mode with no human presence allows most of the oxygen and water vapor to be removed from Northern Isles’ atmosphere. MSFT replaced the oxygen with 100% dry nitrogen. This environment should greatly cut the amount of corrosion in the equipment, a major problem in data centers on land. Mr. Cutler told DCK, “With the nitrogen atmosphere, the lack of oxygen, and the removal of some of the moisture is to get us to a better place with corrosion, so the problems with connectors and the like we think should be less.”

The Redmond Magazine article says Project Natick’s phase 2 has already proved that it’s possible to deploy an underwater data center in less than 90 days “from the factory to operation.” The logistics of building underwater data centers are very different from building data centers on land. Northern Isles was manufactured via a standardized supply chain, not as a construction process. Mr. Cutler said, “Instead of a construction project, it’s a manufactured item; it’s manufactured in a factory just like the computers we put inside it, and now we use the standard logistical supply chain to ship those anywhere.”

The data center is more standardized. It was purposely built to the size of a standard ISO shipping container, so it can be shipped by truck, train, or ship. Naval Group shipped Northern Isles to Scotland on a flatbed truck. Mr. Cutler told DCK, “We think the structure is potentially simpler and more uniform than we have for data centers today … the expectation is there actually may be a cost advantage to this.”

The rapid time to deploy these data centers doesn’t only mean expanding faster; it also means spending less capital. Mr. Cutler explained, “It takes us in some cases 18 months or two years to build new data centers … Imagine if instead … where I can rapidly get them anywhere in 90 days. Well, now my cost of capital is very different … As long as we’re in this mode where we have exponential growth of web services and consequently data centers, that’s enormous leverage.”

rb-

If Project Natick stays on the same trajectory, MSFT could bring data centers to any place in the developed or developing world without adding more stress on local infrastructure. MSFT’s Cutler told DCK “There’s no pressure on the electric grid, no pressure on the water supply, but we bring the cloud.”

As more of the world’s population comes online, the need for data centers is going to skyrocket, and having a fast, green solution like this would prove remarkably useful.



PC’s Meh

We are almost midway through 2018 Q2, and the 2018 Q1 PC sales numbers were meh. The good news is that IDC called the PC market flat. That’s good news because they had predicted a 1.5% decrease for the quarter. IDC reports 60.4 million PCs sold worldwide in the January-to-March period, driven mostly by businesses moving to Windows 10.

Gartner (IT) is less meh and more blah. Gartner saw slightly more PCs shipped in 2018 Q1, at 61.7 million units, for a 1.4% decline. The PC market has now experienced a 14th consecutive quarter of decline, dating back to the second quarter of 2012.

Gartner principal analyst Mikako Kitagawa attributed the decline primarily to the Chinese market: “The major contributor to the decline came from China, where unit shipments declined 5.7 percent year over year.” Ms. Kitagawa continued, “This was driven by China’s business market, where some state-owned and large enterprises postponed new purchases or upgrades, awaiting new policies and officials’ reassignments after the session of the National People’s Congress in early March.”

The top three Gartner vendors — Dell, HP, and Lenovo — accounted for 56.9% of global PC shipments in Q1 of 2018, up slightly from 54.5% of shipments in Q1 of 2017. Dell experienced the strongest growth rate among the top six vendors worldwide, as its shipments increased 6.5%.

HP‘s (HPQ) worldwide PC shipments increased 2.8% in the first quarter of 2018 versus the same period last year. In EMEA, HP Inc. recorded double-digit growth in both desktop and mobile PCs. Gartner says HP Inc. was adversely affected by declining demand in the U.S., which generally accounts for one-third of its total shipments.  

Lenovo’s (LNVGY) global PC shipments remained flat in the first quarter of 2018. Lenovo achieved 6 percent growth in EMEA and double-digit shipment growth in Latin America. However, in Asia/Pacific (its largest market), PC shipments declined 4 percent.

After record holiday sales for consumer and gaming products in the fourth quarter of 2017, Dell continued to do well in the first quarter of 2018. With double-digit shipment increases in EMEA, North America, and Latin America, Dell grew in all regions except Asia/Pacific. Desktop and mobile PCs grew in equal measures, showing Dell’s strength in the business segment according to Gartner.

In the U.S., PC shipments totaled 11.8 million units in the first quarter of 2018, a 2.9% decrease from the first quarter of 2017, according to Gartner. Dell moved into the No. 1 position in the U.S. based on shipments, as its market share increased to 29.1%. HP Inc. moved into second place as its shipments declined 4.8%; its market share totaled 28.4% in the first quarter of 2018.

2018 Q1 - Gartner Global PC Shipments

Company | 2018 Q1 Shipments | 2018 Q1 Market Share (%)
Dell | 3,440 | 29.1
HP Inc. | 3,363 | 28.4
Lenovo | 1,632 | 13.8
Apple | 1,491 | 12.6
Acer Group | 321 | 2.7
Others | 1,586 | 13.4
Total | 11,833 | 100.0
Notes: Data includes desk-based PCs, notebook PCs, and ultramobile premiums (such as Microsoft Surface), but not Chromebooks or iPads. All data is estimated based on a preliminary study; final estimates will be subject to change. The statistics are based on shipments selling into channels. Numbers may not add up to totals shown due to rounding. Shipments are in thousands of units. Source: Gartner (April 2018)
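The share column can be recomputed from the shipment counts (in thousands of units), which makes a handy sanity check on tables like this:

```python
# U.S. Q1 2018 shipments from the Gartner table, in thousands of units.
shipments = {"Dell": 3440, "HP Inc.": 3363, "Lenovo": 1632,
             "Apple": 1491, "Acer Group": 321, "Others": 1586}

total = sum(shipments.values())  # 11,833 thousand units, matching the table
for vendor, units in shipments.items():
    print(f"{vendor}: {100 * units / total:.1f}%")
```

Every recomputed percentage matches the published share column to one decimal place, so this table survived extraction intact.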

PC shipments in EMEA totaled 18.6 million units in the first quarter of 2018, a 1.7% increase, driven by enterprise shipments as many Windows 10 migration projects ran alongside the fast-approaching compliance deadline for the General Data Protection Regulation (GDPR) in Europe.

PC shipments in Asia/Pacific totaled 21.9 million units in the first quarter of 2018, a 3.9% decline from the first quarter of 2017. As previously mentioned, the PC market in China drove the decline in Asia/Pacific.

IDC says the U.S. market saw a promising opening quarter for the year with almost all major vendors reporting increases in notebook sales. Overall, total PC shipments for 2018 Q1 stood at 13.5 million units.

IDC reports that HP Inc. maintained a comfortable lead over all others in the market with its eighth consecutive quarter of overall growth (up 4.3% year on year) and growth in all regions except Latin America.

Lenovo saw a flat quarter in 2018 Q1, the third consecutive quarter in which the company saw year-on-year volume stabilize with flat global growth and a slower pace of decline in the U.S. Dell Inc. posted the strongest year-on-year growth out of all the major companies, growing 6.4% and buoyed by strong performances in nearly every region.

Acer (TPE:2353) held onto fourth place. Its ongoing expansion into gaming and continued investments in Chromebooks have paid dividends for the company but also caused some tough going in other areas. Apple (AAPL) finished the quarter in fifth place with a year-on-year decline in shipments of 4.8%.

2018 Q1 - IDC Global PC Shipments

Company | 2018 Q1 Shipments | 2018 Q1 Market Share (%)
HP Inc. | 13,676 | 22.6
Lenovo | 12,305 | 20.4
Dell Inc. | 10,190 | 16.9
Acer Group | 4,085 | 6.8
Apple | 4,000 | 6.6
Others | 16,128 | 26.7
Total | 60,383 | 100.0
Preliminary results. Shipments are in thousands of units. Source: IDC Quarterly Personal Computing Device Tracker, April 11, 2018

rb-

PC’s used to be a leading indicator of the health of the tech sector. That is not the case anymore. Economic stress has lengthened the life span of PCs from 3 years to nearly 5 years in many firms and even longer in the home market. Increased smartphones capability and cloud-based applications and storage have taken another bite out of the PC market.

But reading the tea leaves, many think PCs are on the rebound. Driving the PC market is demand for premium notebooks in the mainstream and commercial markets. Gaming systems are also part of the equation. IDC expects overall smartphone shipments to decline by 0.2% in 2018 after falling 0.3% last year; the thought is that those dollars will be used to upgrade PCs instead.

Mmmm – we’ll see. I say not likely. Can you say “new normal?”



A Printer for Rocket Scientists

We all dream about the elusive paperless office. It seems even rocket scientists can’t figure it out. Mashable reports that the rocket scientists aboard the International Space Station (ISS), the research laboratory that orbits 254 miles above Earth and travels at more than 17,500 miles per hour, print a lot. The astronauts print roughly 1,000 pages a month on two printers; one is installed on the U.S. side of the ISS, the other in the Russian segment. They print critical mission information, emergency evacuation procedures, and sometimes photos from home on a 20-year-old printer.

NASA IT techs just ordered new printers for the ISS to replace the Epson 800 inkjet printers, which have been on board since the first crew moved in in November 2000. NASA’s ISS team told the author, “When the printer was new, it was like 2000-era tech and we had 2000-era laptop computers. Everything worked pretty good … the printer’s been problematic for the last five or six years.”

Stephen Hunter, manager of ISS computer resources, called the Epson 800 inkjet printer “a museum piece.” NASA had dozens of these printers and, as one failed, they’d send up another.

But now it’s time for something new. In 2018, NASA will send two brand-new, specialized printers up to the station. Mr. Hunter, who has been updating the ISS’s office technology for the last two years, told Mashable that the ISS printers have needed replacing for a long time. However, he can’t just drive over to Best Buy, buy a new printer, and launch it into space.

He started working with HP (HPQ) on an ISS IT overhaul, replacing over 100 existing ISS workstations with HP Gen 2 Z-Book laptops for the crew, so it was only natural they would turn to HP again for the printer project. Enrique Lores, President of HP’s Imaging, Printing, and Solutions business welcomed the opportunity, “We couldn’t pass up the opportunity to do this … It was an incredible technical challenge.”

HP couldn’t just suggest that NASA launch any ordinary laser printer into space. Its friable toner dust and significant power consumption would make it a poor fit for life in microgravity. Ronald Stephens, research and development manager for HP’s Specialty Printing Systems division, explained, “NASA had a very unique set of requirements that we had to meet.”

NASA wanted a printer that could:

• Print and handle paper management in zero gravity – On Earth, printers rely on gravity for paper management. Whatever HP provided would have to hold the paper so it didn’t jam in the printer or float away when the printer was done with it, according to Mashable.

• Handle ink waste during printing – NASA’s Hunter explained that typical inkjet printers deposit some extra ink during the printing process. With gravity, the ink typically stays in the printer or even on the printed sheet. In zero gravity, it floats out. The NASA IT expert said astronauts could ingest the ink, or it could contaminate the crew’s numerous onboard experiments.

• Be flame retardant – HP replaced the printer’s shell with fire-retardant plastic.

• Be power-efficient – The ISS generates all its own electricity through solar panels, which means power consumption must be tightly managed. The article says any new device brought on board must be power-efficient. One bit of good news: HP didn’t have to change the power configuration on the printer, since the ISS can supply a standard 110-volt AC outlet.

Instead of building a specialized printer from scratch, HP recommended the HP Envy 5600, a standard all-in-one device you can buy at retail for $129.99. But the printers heading up to the ISS underwent significant modification.

“We removed the capability to do scanning, fax, and copy out of it to reduce weight and remove glass portions,” said NASA’s Hunter.

Removing what could weigh the printer down or break and become a space disaster was only the start. The most challenging part was related to zero gravity. Ultimately, HP went through every printer system and component to analyze how it would be affected by zero gravity.

HP turned to 3D printing and developed an experimental 3D material: nylon filled with glass beads. Its unique properties allowed HP to replace the multiple parts that make up the printer output tray with a single part that’s lighter, more flexible, and more reliable.

After all the modifications, the HP space printer still looks like a printer. It’s 20 inches wide, 16 inches deep, and five inches high. There’s no lid or glass, but, aside from the 3D-printed materials, the ISS’s next printer looks pretty unremarkable. The HP ENVY Zero-Gravity Printer still uses standard inkjet ink.

To work out the kinks of the new ISS printer, HP worked with a small team from NASA that included astronaut Don Pettit and three other astronauts. Astronauts’ concerns about printing in space are much the same as they are on the ground. “You want it to be uneventful… you want to hit print and have a hard copy,” said Pettit.

Up to this point, all of NASA and HP’s work was theoretical. They did all they could to make the space printer space-ready. However, the only way to know if this printer is suitable for use on the space station before actually sending it to space is by testing it in zero gravity, and the only way to do that is on NASA’s Vomit Comet.

The Vomit Comet is a plane that flies a parabolic flight path. As it loops up and down, passengers experience about 20 seconds of near-weightlessness at the peak of each curve. During those windows, the team verified that printing worked and that the paper flowed through the printer and ejected the right way. NASA’s Hunter said, “It went flawlessly. Everything works to our expectation.”

NASA plans to send the first two printers up to the station on Elon Musk’s SpaceX Dragon spacecraft as part of SpaceX mission CRS-14, scheduled for launch in February 2018.

NASA and HP have retrofitted roughly 50 HP Envy printers and expect each one to last roughly two years. “We want to use this through the remainder of the ISS program. Officially through 2024, with plans through 2028,” said NASA’s Hunter.

“This will be the last printer they get in the space station,” predicted HP’s Stephens.



Is Your Battery Healthy?

Lithium-ion batteries have been in the news lately for causing fires and explosions. Explosions have happened with e-cigarettes, hoverboards, and the Samsung Galaxy Note 7, which was banned from all flights by the FAA due to its explosion risk. Despite the risks, most of today’s most popular gadgets run on a lithium-ion battery.

Fred Langa at the Windows Secrets Newsletter posted an article on how to get the most out of lithium-ion (Li-ion) batteries. Li-ion batteries need very different care and feeding than the nickel-cadmium (Ni-Cd) and nickel-metal-hydride (Ni-MH) batteries used in earlier devices. Proper care of a Li-ion battery can result in as much as 15 times longer service life than an improperly cared-for battery.

Steps to extend the battery service life

The article does not cover ways to get more run time between recharges; those techniques are already well-known. Most portable devices offer ample manual and automatic power-saving modes and methods such as adjusting screen brightness, slowing CPU speed, and reducing the number of apps running.

Rather, the article focused on ways to extend the battery’s overall service life. Follow these five important tips, and you’ll help make sure that your Li-ion batteries deliver long, full, safe service lives in your new portable devices.

Keep your lithium batteries cool

Heat is the number-one enemy of Li-ion batteries. Heat issues can be caused by usage factors such as the speed and duration of battery charging and discharging. The physical environment also matters. Simply leaving your Li-ion powered device in the sun or in an enclosed car, even if the device isn’t being used, can significantly reduce the battery’s ability to take and hold a charge according to the article.

Li-ion batteries perform best at about normal room temperature (68F/20C). If the device warms to 86F/30C, its ability to hold a charge drops by about 20%. Mr. Langa says if the battery is used at 113F/45C — a temperature easily reached by devices that are working hard or sitting in the sun — battery capacity can be reduced by 50%.
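Those three quoted data points can be captured in a small lookup. The nearest-point scheme below is my own simplification; the article doesn’t say how capacity behaves between the quoted temperatures:

```python
# The article's three data points: effective capacity vs. temperature.
capacity_at_temp_c = {20: 1.00, 30: 0.80, 45: 0.50}

def capacity_pct(temp_c):
    # Snap to the nearest quoted point; a real battery degrades continuously.
    nearest = min(capacity_at_temp_c, key=lambda t: abs(t - temp_c))
    return capacity_at_temp_c[nearest] * 100

print(capacity_pct(21))  # 100.0 -> near room temperature
print(capacity_pct(44))  # 50.0  -> near 113F/45C
```

The takeaway is the slope, not the exact numbers: each 10-15 degrees Celsius above room temperature costs a large slice of usable capacity.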

So if your device or battery becomes noticeably warm while you’re using it, the article recommends moving to a cooler place. If that’s not possible, try reducing the amount of power the device is using by turning off unneeded apps, reducing screen brightness, or activating the device’s power-saving mode.

Of course, you can turn the device fully off until its temperature returns to normal. For the fastest cooling, remove the battery if possible, Windows Secrets recommends. The battery and the device will cool off faster if they’re physically separated, according to the article.

Low temperatures aren’t as much of a worry. Low temps usually won’t cause any long-term damage, although a cold battery won’t produce as much power as it otherwise would. The power drop becomes very noticeable at temperatures lower than about 40F/4C. Most consumer-grade Li-ion batteries are essentially useless at temperatures below freezing.

If your Li-ion powered device becomes excessively chilled for any reason, don’t try to use it. The article says to leave it powered off and move it to a warm place until the device is at normal temperature. Once the battery warms to a normal temperature, so will its electrical performance.

Unplug the charger

Overcharging, leaving a battery connected to a too-high voltage source for too long, can reduce a Li-ion battery’s ability to hold a charge, shorten its life, or kill it outright, according to the author. Most consumer-grade Li-ion batteries are designed to work at around 3.6 volts per cell but will accept a temporary overvoltage of around 4.2 volts while charging. Mr. Langa warns that if a charger outputs the higher voltage for too long, internal battery damage can occur.

In severe cases, Windows Secrets warns that overcharging can lead to what battery engineers delicately refer to as “catastrophic failure.” Even in moderate instances, the excess heat produced by overcharging will negatively affect battery life, as you saw in Tip #1.

High-quality chargers work in concert with circuitry inside well-designed Li-ion-powered devices and their batteries, reducing the danger of overcharging by properly tapering off the charging current. The article says the simplest, can’t-fail method is to not leave your Li-ion devices connected to any charger longer than needed.

These properties are quite different from those of older Ni-Cd and Ni-MH battery technologies, which did best when left on their chargers for as long as possible. That’s because those older battery types have a high rate of self-discharge; that is, they start losing a significant amount of stored energy the moment you take them off the charger, even if the device they power is turned off.

In fact, a Ni-Cd battery can self-discharge at a rate of 10% in the first 24 hours. The self-discharge curve flattens after that, but a Ni-Cd battery will still lose an additional 10–20% charge per month. Ni-MH batteries are even worse. Their self-discharge rate is about 30% higher than that of Ni-Cd.

But Li-ion batteries have a very low rate of self-discharge. A healthy, full, lithium battery will self-discharge at about only 5% in the first 24 hours off the charger — with only 2% or so per month after that.
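The difference compounds over time. A back-of-the-envelope model using the article’s figures (treating the percentages as points of full capacity lost: a first-day drop, then a flat monthly rate; the function is illustrative only):

```python
# Rough self-discharge model from the article's figures:
# a first-day loss, then an approximate flat monthly loss thereafter.
def remaining_charge(first_day_loss: float, monthly_loss: float, months: int) -> float:
    """Percent of charge left after the first day plus `months` in storage."""
    charge = 100.0 - first_day_loss
    for _ in range(months):
        charge -= monthly_loss
    return max(charge, 0.0)

# Ni-Cd: ~10% lost in the first 24 hours, then 10-20%/month (15% used here).
print(remaining_charge(10, 15, 3))  # 45.0
# Li-ion: ~5% in the first 24 hours, then ~2%/month.
print(remaining_charge(5, 2, 3))    # 89.0
```

After three months on the shelf, the hypothetical Ni-Cd pack is below half charge while the Li-ion pack is still nearly full.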

It’s simply not necessary to leave a Li-ion device on the charger until the last possible moment. For best results and the longest battery life, unplug the charger when it or the lithium-powered device shows a full charge.

It’s also not necessary to give new Li-ion devices an extended charge before first use. (Ni-Cd or Ni-MH devices used to come with warnings to do an initial charge of anywhere from 8 to 24 hours.) Li-ion batteries are fully ready for use when the charger or the device reads 100% charge. No extended charging is needed.

Don’t deep-discharge your battery

Not all discharge cycles exact the same toll on a battery. Long and heavy usage generates more heat, putting more stress on the battery; smaller, more frequent discharges extend the overall life of lithium batteries.

You might think that a higher number of small discharge/recharge cycles would eat into the battery’s overall lifespan. That was true with older technologies, but the author says it’s not the case with Li-ion. Battery specs can be confusing because most manufacturers count a full Li-ion charge cycle as whatever it takes to add up to a 100% charge. For example, three 33% discharge/recharge cycles equal one full-charge cycle, five 20% cycles equal a full charge, and so on.

In short, a higher number of small discharge/recharge cycles doesn’t reduce a lithium battery’s total available full-charge cycles.
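The article’s cycle accounting reduces to simple arithmetic: partial discharges accumulate fractionally until they total one 100% cycle. An illustrative sketch, not a manufacturer’s formula:

```python
# The article's cycle accounting: partial discharge/recharge events
# add up fractionally until they total one full 100% charge cycle.
def full_cycles_used(partial_discharges_percent: list) -> float:
    """Total full-charge cycles consumed by a series of partial discharges."""
    return sum(partial_discharges_percent) / 100.0

print(full_cycles_used([33, 33, 33]))  # 0.99 -- three 33% discharges, ~1 cycle
print(full_cycles_used([20] * 5))      # 1.0  -- five 20% discharges, 1 cycle
```

By this bookkeeping, topping up frequently from 80% costs no more cycle life than running the battery down once from 100%.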

Again, heat and stress from heavy discharges cut battery life. So try to keep your deep-discharge events to a minimum. Mr. Langa recommends that you don’t let your device routinely run down to zero charge (where the device turns itself off). Instead, think of the bottom 15–20% of battery capacity as a reserve — for emergency use only. Get into the habit of swapping in a fresh battery (if possible) or plugging the device into external power well before the battery is empty.

Slow and steady is best

As you now know, both fast discharging and fast recharging generate excess heat and exact a toll on battery life. Windows Secrets says if you’ve run a device long and hard, let the battery cool to room temperature before recharging it. Batteries won’t accept a full charge when hot. And when recharging, make sure your charger doesn’t make the battery become hot to the touch; a hot battery is a sign the charger is pumping too much current, too fast, through the battery.

Overcharging is more likely with chargers that are cheap, off-brand models; that use fast-charge circuitry; or that are wireless (inductive). These “dumb” chargers simply pump out current, accepting little or no feedback from the device being charged. Overheating and overvoltages can easily occur, damaging or even destroying the battery.

Fast chargers provide a useful charge to a drained battery in minutes and not hours. The author explains there are various approaches to fast-charging technology, and not all of them are compatible with all lithium batteries. Unless the charger and the lithium battery are specifically designed to work together, fast charging could cause overheating and overvoltages. Generally, it’s best not to use one brand of fast charger on a different brand’s device.

Wireless (inductive) chargers use a special charging mat or surface to restore a battery’s power. It sounds wonderfully convenient, but inductive charging always generates excess heat, even when it’s working normally.

Not only is the excessive heat produced by a wireless charger bad for lithium batteries, it also wastes energy. By its nature, inductive charging’s efficiency is always going to be lower than a standard charger’s. Mr. Langa says that higher heat and lower efficiency easily outweigh the convenience.

In any case, the safest approach is to use only chargers sold by the OEM of your lithium-powered device. It’s the only way to be sure that the charger will keep temperatures and voltages within specs. The article recommends that if an OEM charger isn’t available, use a low-output charger that’s unlikely to pump damaging amounts of power into the device you’re charging.

One source of low-output, non-OEM charging that’s often available is the USB port on a standard PC. A typical USB 2.0 port provides 500mA (.5 amps) per port; USB 3.0 provides up to 900mA (.9 amps) per port. In contrast, some dedicated chargers will output 3,000-4,000mA (3-4 amps). The low amperage offered by USB ports will usually provide cool, safe charging of almost any Li-ion device.
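Those current figures translate directly into charge time. A rough estimate for a hypothetical 3,000 mAh battery (ignoring charging inefficiency and the taper near full charge):

```python
# Rough charge-time estimate: capacity (mAh) / current (mA) = hours.
# Ignores charging losses and the taper near full charge, so real
# times run somewhat longer.
def charge_hours(capacity_mah: float, current_ma: float) -> float:
    return capacity_mah / current_ma

battery_mah = 3000  # hypothetical phone-sized battery
print(round(charge_hours(battery_mah, 500), 1))   # 6.0 hours on USB 2.0
print(round(charge_hours(battery_mah, 900), 1))   # 3.3 hours on USB 3.0
print(round(charge_hours(battery_mah, 3000), 1))  # 1.0 hour on a 3 A charger
```

The trade-off the article describes is visible in the numbers: the USB port takes six times as long as the 3 A charger, but pushes one-sixth the current, and therefore far less heat, through the battery.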

If possible, carry a spare battery

If your device allows for easy battery replacement, carrying a spare battery is cheap insurance. It will give you twice the run time. When the in-use battery approaches 15–20% charge, simply swap out the drained battery for a fresh, cool one — you get instant full power, with no heat worries.

A spare battery also allows for other benefits. For example, if you find yourself in a situation where the installed battery is running hot, you can swap out the hot battery to let it cool. Having two batteries should also eliminate any need to use fast chargers — you can charge the spare at a safe, slow rate while the other is in use.

rb-

For more tips on how to keep your Apple iPhone battery in tip-top shape, check out this post from 2014.


Ralph Bach has been in IT long enough to know better and has blogged from his Bach Seat about IT, careers, and anything else that catches his attention since 2005. You can follow him on LinkedIn, Facebook, and Twitter. Email the Bach Seat here.