Tag Archive for GOOG

Will Climate Change Sink the Web?

Despite claims to the contrary, climate change is real, and it will break critical parts of the Internet within 20 years. That is the prediction of a study by Paul Barford, a professor of computer science at the University of Wisconsin–Madison.

Professor Barford presented his findings at IETF 102, a Montreal meeting of the Internet Engineering Task Force held with the Association for Computing Machinery, the Internet Society, and the Institute of Electrical and Electronics Engineers. The study, “Lights Out: Climate Change Risk to Internet Infrastructure,” found that critical communications infrastructure could be submerged by rising seas in as little as 15 years.

Conventional copper and fiber optic cables

Companies like Google, Microsoft, Facebook, and Cable and Wireless go to enormous cost and effort to protect the undersea cables spanning the continents, but once a cable hits the shore, the signal moves to conventional cables. The conventional copper and fiber optic cables that carry signals from the landing points to the interior were buried decades ago and were never designed to withstand inundation by saltwater.

Internet landing points that will be impacted by climate change

Popular Science reports that Professor Barford’s research found climate change will impact more than 4,000 miles of buried fiber optic conduit. These conduits will most likely end up underwater and become inoperable, because saltwater damages the cables and degrades their ability to carry signals. The cable landing stations where undersea cables connect the U.S. Internet to the rest of the world will also be vulnerable. The study also predicts that water will surround over 1,100 traffic hubs.

Undersea fiber optic cable landing point susceptible to flooding

Major interruptions

Professor Barford told Popular Science that these service interruptions are likely to become a growing problem within the next 15 years. He warned that communications companies should begin implementing protective measures soon if they want to avoid major interruptions in the near future.

“Most of the damage that’s going to be done in the next 100 years will be done sooner than later,” says Dr. Barford, the keeper of the Internet Atlas, a comprehensive repository of the physical Internet — the buried fiber optic cables, data centers, traffic exchanges and termination points that are the nerve centers, arteries, and hubs of the vast global information network. “That surprised us. The expectation was that we’d have 50 years to plan for it. We don’t have 50 years.” He also notes “The landing points are all going to be underwater in a short period of time.”

The study is the first risk assessment of the impact of climate change on the U.S. infrastructure of the Internet. It reports that Miami, New York, and Seattle are among the areas where connectivity could be most affected. The Internet in these cities is at risk because cables carrying it tend to converge on a few fiber optic strands that lead to large population centers.

Fiber optic cable conduit susceptible to flooding

But the effects of climate change would not be confined to those areas; they would ripple across the Internet, potentially disrupting global communications. Many of the conduits at risk are already close to sea level, and only a slight rise in ocean levels due to melting polar ice and thermal expansion will expose buried fiber optic cables to seawater.

No thought was given to climate change

Much of the infrastructure at risk is buried and follows long-established rights of way, typically paralleling highways and coastlines. The roots of the danger emerged inadvertently during the Internet’s rapid growth in the 1980s before there was widespread awareness of the Internet as a global grid or the massive threats of climate change. Professor Barford says, “When it was built 20-25 years ago, no thought was given to climate change.”

To reach this conclusion, the team combined data from the Internet Atlas and projections of sea level incursion from the National Oceanic and Atmospheric Administration (NOAA).
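The overlay idea can be illustrated with a toy sketch. To be clear, this is not the study's code or data: the real analysis intersects NOAA sea level incursion polygons with Internet Atlas geodata, while the function, records, and elevations below are purely hypothetical.

```python
def at_risk(segments, sea_level_rise_ft):
    """Toy version of the study's overlay: flag buried cable segments
    whose elevation above today's sea level is at or below the
    projected rise. The real analysis is geospatial, not a simple
    per-segment elevation check."""
    return [s["name"] for s in segments if s["elevation_ft"] <= sea_level_rise_ft]

# Hypothetical sample records (names and elevations are invented):
conduits = [
    {"name": "NYC landing conduit", "elevation_ft": 0.5},
    {"name": "Seattle metro run", "elevation_ft": 2.0},
    {"name": "Denver backbone", "elevation_ft": 5280.0},
]
print(at_risk(conduits, 1.0))  # ['NYC landing conduit']
```

Even this crude check conveys why the timeline is short: much of the at-risk conduit sits within a foot or two of the current waterline.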

Fiber optic cable

Science Daily says the findings of the study serve notice to industry and government: “This is a wake-up call. We need to be thinking about how to address this issue.” Mikhail Chester, the director of the Resilient Infrastructure Laboratory at the University of Arizona, told National Geographic that the new study “reinforces this idea that we need to be really cognizant of all these systems because they’re going to take a long time to upgrade.”

ISP responses to climate change

The impact of mitigation such as sea walls, according to the study, is difficult to predict. “The first instinct will be to harden the infrastructure,” Professor Barford says. “But keeping the sea at bay is hard. We can probably buy a little time, but in the long run, it’s just not going to be effective.”

US shore susceptible to flooding

The study also called out individual internet service providers, finding that AT&T (T), Verizon (VZ), and CenturyLink (CTL) are at the most risk. In response, AT&T spokesman Jeff Kobs told NPR,

AT&T uses fiber optic cable “designed for use in coastal areas as well as being submerged in either salt- or fresh-water conditions. … In certain locations where cabling will be submerged for long periods of time or consistently exposed, such as beaches or in subways, we use submarine underwater cabling.”

Verizon spokeswoman Karen Schulz told NPR,

After Sandy, we started upgrading our network in earnest, and replacing our copper assets with fiber assets … Copper is impacted by water, whereas fiber is not. We’ve switched significant amounts of our network from copper to fiber in the Northeast.

She explained that Verizon’s focus on flood risk

really has less to do with sea-level change and more to do with general flooding concerns … For cable landing stations that are very close to the oceans and that have undersea cables, we specifically assess sea-level changes.

A representative of CenturyLink told Popular Mechanics the company can handle the problem, saying that CenturyLink networks are designed with redundancy and can divert traffic to alternate routes when infrastructure goes down.

rb-

Donald Trump Still Doesn’t Believe in Climate Change

The Verizon and CenturyLink responses seem to totally miss the point.

The impact of large-scale Internet failures goes beyond Facebook and iTunes. The failure of the Internet would disrupt day-to-day services that real people depend on: online banking, traffic signals, railroad routing, the sharing of medical records among doctors and hospitals, and the growing “internet of things” that spans everything from household appliances to regional grids of electric power production and transmission.


Ralph Bach has been in IT long enough to know better and has blogged from his Bach Seat about IT, careers, and anything else that catches his attention since 2005. You can follow him on LinkedIn, Facebook, and Twitter. Email the Bach Seat here.

Follow the Open Source Money

Matt Asay at InfoWorld recently pointed out some interesting data on who really contributes to open source. Wikipedia, perhaps the most well-known open-source project, defines open-source software as software whose source code is published and made available to the public, enabling anyone to copy, modify, and redistribute it without paying royalties or fees. Open-source code can evolve through community cooperation, and these communities include individual programmers as well as large companies.

Open source

Adobe developer Fil Maj used the GitHub REST API to pull public profile information from GitHub users. A REST API is a web interface that lets two programs communicate over HTTP. Using the API, Mr. Maj collected the company field from all 2,060,011 GitHub user profiles that were active in 2017 (“active” meaning ten or more commits to public projects). From that data, Mr. Maj was able to tally the total number of corporate contributors to GitHub, with results that might surprise you.
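The core of that analysis is just grouping profiles by a normalized company name. Here is a minimal sketch of that step, assuming the profile JSON has already been fetched from the GitHub REST API (e.g. `GET https://api.github.com/users/<login>`); the function names and sample records are illustrative, not Mr. Maj's actual code.

```python
from collections import Counter

def normalize_company(raw):
    """Normalize the free-text 'company' field from a GitHub profile
    so that '@Microsoft' and 'microsoft ' count as the same employer."""
    if not raw:
        return None
    return raw.strip().lstrip("@").strip().lower()

def rank_companies(profiles):
    """Count active contributors per employer from a list of profile
    dicts (each with an optional 'company' key), most common first."""
    counts = Counter()
    for profile in profiles:
        company = normalize_company(profile.get("company"))
        if company:
            counts[company] += 1
    return counts.most_common()

# Tiny invented sample; the real data set held ~2 million profiles.
profiles = [
    {"company": "@Microsoft"}, {"company": "microsoft"},
    {"company": "Google"}, {"company": None},
]
print(rank_companies(profiles))  # [('microsoft', 2), ('google', 1)]
```

Normalization matters here: without it, the same employer written three different ways would fragment into three entries and undercount every company.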

Here are the ranking of GitHub contributors, with their total number of employees actively contributing to open source projects on GitHub:

Rank  Company              Employees Contributing
1     Microsoft            4,550
2     Google               2,267
3     Red Hat              2,027
4     IBM                  1,813
5     Intel                1,314
6     Amazon.com             881
7     SAP                    747
8     ThoughtWorks           739
9     Alibaba                694
10    GitHub                 676
11    Facebook               619
12    Tencent                605
13    Pivotal                591
14    EPAM Systems           585
15    Baidu                  584
16    Mozilla                469
17    Oracle                 455
18    Unity Technologies     414
19    Uber                   388
20    Yandex                 351
21    Shopify                345
22    LinkedIn               343
23    Suse                   325
24    ESRI                   324
25    Apple                  292
26    Salesforce.com         291
27    VMware                 271
28    Adobe Systems          270
29    Andela                 259
30    Cisco Systems          233

The author points out that this is not a perfect measure, but it is a much richer, more accurate data set for figuring out total contributors for any company. Even with that caveat in mind, we end up with many more corporate open source contributors than previous data suggested.

Microsoft’s contributions to open source

The new data shows Microsoft (MSFT) is the number 1 open source contributor. Redmond has twice the number of contributors of its next nearest competitor. Remember Steve Ballmer‘s developers! developers! developers! meltdown? For those of us who were around when Mr. Ballmer, then the Microsoft CEO, called open source a “cancer” and “anti-American,” this is a remarkable change of heart for MSFT.

Red Hat

Mr. Maj’s data puts Red Hat (RHT), the open source leader, among the top contributors. Red Hat has dramatically fewer engineers on its payroll than Google (GOOG) or Microsoft, so it is doubly impressive that Red Hat places so highly. Pretty much every engineer in the company works on open-source projects.

Amazon

 

Amazon logo

Often considered an open source ne’er-do-well, Amazon (AMZN) comes in at No. 6 in the rankings, with nearly 900 open source contributors on staff. The article points out that Amazon has perhaps not publicly led the open source effort the way Google and Microsoft have, but it remains a strong contributor to the projects that feed its developer community.

China is a net consumer of open source

Chinese companies like Baidu, Tencent, and Alibaba, which have long been perceived to be net consumers of open source, actually contribute quite a bit according to the new data.

Legacy firms

Legacy firms like Intel (INTC), Oracle (ORCL), Adobe (ADBE), and Cisco (CSCO) rank among the top 30 open source contributors reports InfoWorld.

rb-

Color me suspicious, but have these firms really embraced open source? Or have they just adapted their business models to usurp elements of open source and lay their proprietary code on top of it? That saves them the bother of writing new code, yet they can still charge proprietary prices for software where they have reduced their development costs.

Tom Brady hanging high five

After all, numbers don’t lie. In 2014, half of companies surveyed said they use open source in their products. Just one year later, the number grew to 78%. As long as open source continues to enjoy its place in the sun, we should expect the Microsoft-open source bromance to continue.



Browser Security Updates

If you bank, shop, or work on the Intertubes, your security is changing. Your browser security is changing because Symantec is selling its Website Security and related PKI encryption solutions business to DigiCert for nearly $1 billion.

SSL and TLS logo

Experts estimate that Symantec (SYMC) owns 40% of the SSL certificate market. SSL/TLS certificates are used to encrypt the connections between browsers and HTTPS-enabled websites, and to verify that users are actually visiting the websites they intended to visit, not spoofed versions. Certificates are issued by organizations known as certificate authorities, which are trusted by default in browsers and operating systems.

As a result of the sale, many firms will have to reissue SSL/TLS server certificates. Reissuing ensures that browsers continue to trust the certificates and that there is no impact on your online experience; these certificates are essential for secure, encrypted communication on the Intertubes.

Google Chrome browser security

Google (GOOG) has led the effort to decrease the disruption that could come along with this change. Google posted a plan back in July of 2017 regarding Symantec-issued SSL/TLS server certificates.

• In March 2018, Google Chrome (Chrome 66 Beta) will show a warning for sites secured with Symantec SSL/TLS certificates issued before June 1, 2016. Data encryption will still function normally, but your transactions will be disrupted by a warning in Chrome.
• Google has also stated that all SSL/TLS certificates issued by Symantec before December 1, 2017, will no longer be trusted starting in September 2018 (Chrome 70 Beta). Doing transactions at sites that have not updated their certificates will put your security at risk, and you will get a warning in Chrome.
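Google's two cutoff dates can be captured in a few lines. This is only a sketch of the schedule as summarized above, not any official Google API; the function name and version-number checks are my own.

```python
from datetime import date

# Cutoff dates from Google's published plan for Symantec-issued certificates.
FIRST_CUTOFF = date(2016, 6, 1)    # certs issued before this: warned from Chrome 66
FINAL_CUTOFF = date(2017, 12, 1)   # certs issued before this: distrusted from Chrome 70

def chrome_status(issued, chrome_version):
    """Return how Chrome is expected to treat a Symantec-issued
    certificate, given its issue date and the Chrome major version."""
    if chrome_version >= 70 and issued < FINAL_CUTOFF:
        return "distrusted"
    if chrome_version >= 66 and issued < FIRST_CUTOFF:
        return "warned"
    return "trusted"

print(chrome_status(date(2016, 1, 15), 66))  # warned
print(chrome_status(date(2016, 1, 15), 70))  # distrusted
print(chrome_status(date(2017, 6, 1), 66))   # trusted
```

The practical upshot for site operators: a certificate reissued by DigiCert after the sale falls outside both cutoffs and stays trusted.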

Mozilla Firefox

Mozilla, publisher of the Firefox web browser says that it intends to follow the same timeline proposed by Google.

rb-
This change amounts to a normal certificate renewal. There should be no service disruption when the new certificates are issued, as long as your web browser is up to date, and there is no reason to have an out-of-date browser anymore: all three major browsers auto-update. Other keys to staying safe online include:

  • Always check for HTTPS when you plan on providing personal data to a website.
  • Pay attention to any security warnings you receive when you visit a website. Although you can almost always trust the HTTPS you see in your browser URL, additional warnings from your browser can indicate a problem with the connection, so proceed with caution.

Nearly 54% of all U.S. web browsers will be affected by these changes: Statista says that in December 2017, Chrome held almost 50% of the browser market and Firefox held over 5%. The remaining 41% of Internet users are not covered by this change (Safari at 32.7% and IE/Edge at 9%).



OMG Texting b 25 !

This week marks the 25th birthday of the text message. Texting is more properly known as SMS. On Dec. 3, 1992, 22-year-old Sema Group software architect Neil Papworth typed the first SMS (Short Message Service) message, “Merry Christmas,” on a computer and sent it over a GSM network in the UK to an Orbitel 901 handset owned by then-Vodafone director Richard Jarvis.

SMS service

In 1993, a year after the first text message was sent, Nokia (NOK) set up the first commercial SMS service in Finland. Nokia was the first handset manufacturer whose entire GSM phone line supported sending SMS text messages. In 1997, Nokia became the first manufacturer to produce a mobile phone with a full keyboard: the Nokia 9000i Communicator.

Texting adoption

SMS adoption was slow at first, with users sending an average of only 0.4 text messages per month in 1995. The fact that UK users could only send SMS messages to others on the same network was a big problem until the restriction was lifted in 1999. However, as mobile phones became more popular and text messaging became easier to use, SMS popularity ballooned. By 2007, the Brits were sending 66 billion SMS messages a year, and in 2012 they sent 151 billion texts.

Nokia 9000i Communicator

In the U.S., SMS was slower to catch on, mainly because mobile operators charged more for texts and less for voice calls, and because of the popularity of PC-to-PC instant messaging (IM). Even so, Americans sent 45 billion text messages per month in 2007, a figure that grew to 167 billion per month in 2011. By June 2017, 781 billion text messages were being sent in the United States per month, according to Statistic Brain.

U.S. Texts Sent

Month       Messages Sent/Month    YoY Increase         YoY % Increase
June 2017   781,000,000,000        147,000,000,000      431.3%
June 2016   634,000,000,000         73,000,000,000      768.5%
June 2014   561,000,000,000         63,000,000,000      790.5%
June 2013   498,000,000,000         75,000,000,000      564.0%
June 2012   423,000,000,000         56,000,000,000      655.4%
June 2011   367,000,000,000        126,000,000,000      205.8%
June 2010   247,000,000,000         86,000,000,000      187.2%
June 2009   161,000,000,000         86,000,000,000       87.2%
June 2008    78,000,000,000         30,000,000,000      150.0%
June 2007    45,000,000,000         32,500,000,000       38.5%
June 2006    12,500,000,000          5,250,000,000      138.1%
June 2005     7,250,000,000          4,390,000,000       65.1%
June 2004     2,860,000,000          1,660,000,000       72.3%
June 2003     1,200,000,000            270,000,000      344.4%
June 2001        33,000,000             21,000,000       57.1%
June 2000        12,000,000
Text Message Statistics – United States from Statistic Brain (www.statisticbrain.com)

With 25 years under its belt, many people wonder if the end of the line is near for SMS. This is because apps such as Apple‘s (AAPL) iMessage, Google‘s (GOOG) Hangouts, Facebook‘s (FB) Messenger, WhatsApp, and SnapChat have become very popular.

Closed systems

Chat application

These new chat applications also marked a more fundamental shift away from an open standard that anyone could use (even if your operator charged you) toward closed messaging systems controlled by technology giants. Text messages, however, might not be going away soon. SMS is a practical, easy-to-use communication method, especially in areas and countries that do not have reliable internet connections.



Are You a Human

Detroit-based Are You a Human was recently purchased by Virginia-based Distil Networks. The purchase is part of Distil’s effort to expand its bot-detection capabilities. As part of the acquisition, the Human Tag will be rebranded as Distil Bot Discovery. Distil will open an office in Detroit and increase its presence in Motown. All 10 of Are You a Human’s employees are staying on, according to reports.

The firm’s website describes the Are You a Human technology:

[Are You Human] collects hundreds of fingerprinting metrics and analyzes user’s device, software, and natural behavior to develop robust behavioral metrics on each page view in real-time … Only through an expert understanding of natural human characteristics and behavior is it possible to identify the 99% of non-human traffic caused by new and unique bots that fraud detection and verification systems can’t find

Suite of bot-detection products

Distil Networks will add Are You a Human’s real-time analysis technology and biometric information to its own suite of bot-detection products and use it to launch a free bot-discovery plugin for Google Analytics. Detecting bots is important because bots can inflate website traffic numbers or present a security risk by searching for sensitive information.
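As a purely illustrative sketch of behavioral bot detection in general, a rule-based detector might score a session on a few machine-like signals. This is my own toy heuristic, not Are You a Human's model; every field name and threshold below is hypothetical.

```python
def looks_like_bot(session):
    """Toy behavioral heuristic: flag a session when at least two
    machine-like signals are present. Real systems weigh hundreds of
    fingerprinting metrics instead of three hand-picked rules."""
    signals = 0
    if session.get("mouse_moves", 0) == 0 and session.get("key_presses", 0) == 0:
        signals += 1  # no natural input events at all
    if session.get("requests_per_sec", 0) > 10:
        signals += 1  # requesting pages faster than a human can browse
    if not session.get("accepts_cookies", True):
        signals += 1  # many crawlers ignore cookies
    return signals >= 2

print(looks_like_bot({"mouse_moves": 0, "key_presses": 0, "requests_per_sec": 40}))  # True
print(looks_like_bot({"mouse_moves": 12, "key_presses": 5, "requests_per_sec": 1}))  # False
```

The hard part, and the reason Distil paid for the technology, is that real bots actively mimic human signals, so production systems rely on far richer fingerprints than this.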

The firm praised the Motor City as being:

… incredibly helpful and supportive to us, and we can’t imagine doing this anywhere else. Being able to build this company in Detroit has been hugely meaningful to all of us, and we’ll still be part of that awesome community going forward.

Detroit skyline

 

