Tag Archive for Cloud computing

Please Take Lotus Notes

In a move to free up some cash and make room for its $34 billion acquisition of Red Hat Inc. (RHT), IBM (IBM) is selling off its enterprise software business for $1.8 billion to HCL Technologies.

HCL Technologies is a global services company valued at $8 billion. India-based HCL operates out of 43 countries, serving the financial services, manufacturing, telecommunications, media, publishing, entertainment, retail, and other industries.

Lotus Notes

The sale includes most of IBM’s enterprise software business, including the Lotus Notes and Domino collaboration software, the Tivoli network management software, and other titles. Lotus Notes, developed at Iris Associates under Ray Ozzie and released in 1989, was a pioneering enterprise software tool that swept the market with features, such as email and collaboration workspaces, that we now take for granted.

Lotus, founded in 1982, rose to fame in 1983 with the Lotus 1-2-3 spreadsheet, which drove the popularity of the freshly minted IBM PC. IBM took over Lotus in 1995 for the then-astounding sum of $3.52 billion. IBM looked to the Lotus acquisition to change its white-shirt-and-tie culture and embrace the MTV age and the new Internet.

Lotus Notes and Domino ranked among the top client-server groupware and email systems in the 1990s, competing head-on against Microsoft Exchange. While Microsoft successfully migrated Exchange to Office 365 in the cloud, Notes and Domino largely missed the cloud era.

Big Blue acquired Tivoli for $743 million in 1996. It ranked among the leading IT management software providers in the 1990s and early 2000s, competing against CA Technologies, BMC, and HP. Each of those companies stumbled in recent years, opening the door for ServiceNow to disrupt major portions of the market.

The IBM world-view

The HCL deal highlights IBM’s failure to navigate the shift from client-server software to SaaS. Lotus Notes stayed a client-server system and lost business to cloud platforms such as Amazon Web Services (AWS), Microsoft Azure, and Google Cloud Platform.

Now that the business has been lost, IBM is moving in a different direction. Older software like Lotus Notes and Domino doesn’t really play a role in the new IBM world-view. One IBM solution provider told CRN, “I can understand getting rid of Lotus Notes and Domino. Microsoft Office 365 and Google Apps are killing the hell out of Lotus Notes.”

In addition to Lotus Notes, Domino, and Tivoli, the IBM Software asset sale to HCL includes:

  • IBM AppScan, a security-focused application for identifying and managing vulnerabilities in mission-critical applications;
  • IBM BigFix endpoint management and security software;
  • IBM Unica, cloud-based enterprise marketing automation software; and
  • IBM WebSphere Commerce, an omnichannel commerce platform for B2C and B2B organizations.

rb-

While I am the PM on our move off of Notes to SaaS products like O365, every once in a while I catch myself saying that Notes worked well. But then I remember that it is overly complex and proprietary, and that the client software is huge and bloated, with no lightweight alternative.


Barracuda Networks Has Been Bought

With the massive Equifax data breach still fresh in everyone’s minds, the cybersecurity workforce expected to be short nearly 2 million people, and IT security expenditures forecast to top $1 trillion by 2022, private equity giant Thoma Bravo, LLC has jumped back into the IT security market with both feet. Barracuda Networks has been bought by the private equity firm in a deal valued at $1.6 billion.

Barracuda (CUDA) sells appliance- and cloud-based cybersecurity and data protection services. Clients include Boeing, Microsoft, and the U.S. Department of Defense. Barracuda says it has over 150,000 customers. Upon the close of the transaction, Barracuda will operate as a privately held company.

Barracuda Networks has been bought

Barracuda Networks was founded in Ann Arbor, Michigan in 2003 and raised at least $46 million in venture funding prior to its IPO. It acquired Yosemite Technologies in 2009 to expand into the storage market, and CUDA went public on the New York Stock Exchange in November 2013, pricing its IPO at $18.

Barracuda continued to innovate in the run-up to its acquisition. eWeek reports that in March 2017, Barracuda debuted new data backup and recovery capabilities for VMware and Microsoft virtual machines. In June 2017, Barracuda announced its new Sentinel service, which uses artificial intelligence (AI) and container-based technologies to improve email security.

Barracuda also enhanced its network security products and services in 2017. eWeek reported in November that the company expanded the cloud capabilities of its Web Application Firewall (WAF) and NextGen Firewall products. The new capabilities include usage-based billing for the NextGen Firewall running in the Amazon Web Services (AWS) cloud, and automated configuration for the WAF, thanks to an integration with the Puppet DevOps tool.

CEO BJ Jenkins commented on the transaction, “We will continue Barracuda’s tradition of delivering easy-to-use, full-featured solutions that can be deployed in the way that makes sense for our customers.”

Thoma Bravo

Thoma Bravo is a Chicago-based private equity firm with $17 billion under management. Its appetite for IT firms is rather broad. Some of its most notable purchases have been:

  • September 2014 – $2.4 billion purchase of Detroit-based Compuware.
  • December 2014 – $3.6 billion acquisition of Riverbed.
  • October 2015 – Teamed up with Silver Lake to buy IT infrastructure management vendor SolarWinds for $4.5 billion.
  • April 2017 – Purchased a minority stake in the freshly re-spun McAfee.
  • June 2017 – Purchased remote monitoring and management (RMM) and IT security management vendor Continuum.

Its portfolio has included brands such as Bomgar, DigiCert, Digital Insight, Dynatrace, Hyland Software, Imprivata, iPipeline, Nintex, Planview, Qlik, SailPoint, and SonicWall.

Thoma Bravo has resold many of its holdings in recent years.

TechCrunch notes that private equity firms began buying up software companies more aggressively last year, with the thinking that they can generate reliable returns from such investments. The biggest recent take-private deals include:

  • Marketo, a marketing software maker that went public in 2013 and was taken private again by Vista Equity Partners in 2016 for $1.79 billion in cash; and
  • Event-management company Cvent, sold last year to Vista Equity Partners in a $1.65 billion deal.

On the funding side, cybersecurity risk-monitoring platform SecurityScorecard raised $27.5 million from the VC arms of Google, Nokia, and Intel. Other notable IT security equity funding recipients include Attivo Networks, Darktrace, and SentinelOne.

Investopedia speculates that Thoma Bravo is paying a pretty high premium for Barracuda: CUDA now trades at 139 times earnings and 4 times sales. But under private management, its products will likely be integrated with the firm’s other software holdings to generate synergies.
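
To put those multiples in perspective, here is a quick back-of-the-envelope calculation. This is my own arithmetic, treating the $1.6 billion deal price as the market value, not Investopedia’s math:

```python
# Rough check of Barracuda's valuation multiples, treating the reported
# $1.6B deal price as the market value (an approximation on my part).
deal_price = 1.6e9  # USD

pe_ratio = 139  # price-to-earnings multiple cited by Investopedia
ps_ratio = 4    # price-to-sales multiple cited by Investopedia

implied_earnings = deal_price / pe_ratio
implied_sales = deal_price / ps_ratio

print(f"Implied annual earnings: ${implied_earnings / 1e6:,.0f}M")  # ~$12M
print(f"Implied annual sales:    ${implied_sales / 1e6:,.0f}M")     # ~$400M
```

Earning roughly $12 million on roughly $400 million in sales is why that 139x multiple looks rich.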

CRN notes that being a privately owned company will give Barracuda a stronger ability to chart its own destiny. Barracuda will not have to “tap-dance to the Wall Street music,” said Michael Knight, president and chief technology officer at solution provider Encore Technology Group in Greenville, S.C. He hopes Thoma Bravo’s infusion of capital will enable Barracuda to keep driving its public cloud business, solidify its SD-WAN toolset, and deliver more integrated endpoint security protection.

rb-

I have used Barracuda products at past jobs, including their spam/email firewall appliances and their cloud-based backup system. The pricing was adequate and renewals were easy. The email firewalls were really robust and almost set-and-forget.

The few times I needed tech support, it was available in Ann Arbor, Michigan. Barracuda, founded in Ann Arbor, was one of the early believers in the area as a high-tech hub. Barracuda plans to spend $2.3 million to expand its operations center in the former Borders Books offices at 317 Maynard Street, adding 115 new jobs in downtown Ann Arbor over the next four years. I hope that now that Barracuda Networks has been bought by Thoma Bravo, the deal does not come with a “Chainsaw Al” who will kill that growth.


Chatbot Risks

Chatbots are the latest rage on social media. As Time explained, they have been around since the 1960s, when MIT professor Joseph Weizenbaum created a chatbot called ELIZA. Chatbots later found a home on desktop messaging clients like AOL Instant Messenger, then went dormant as messaging moved from desktops to mobile devices.
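
For those who never met ELIZA, the original chatbots were little more than pattern matching and canned replies. Here is a toy Python sketch of the general idea; it is my own illustration, not Weizenbaum’s actual script, which also reflected pronouns back at the user:

```python
import re

# Toy ELIZA-style responder: match a pattern, echo part of it back
# as a question. A sketch of the idea, not the original DOCTOR script.
RULES = [
    (re.compile(r"\bI need (.+)", re.I), "Why do you need {0}?"),
    (re.compile(r"\bI am (.+)", re.I), "How long have you been {0}?"),
    (re.compile(r"\bmy (\w+)", re.I), "Tell me more about your {0}."),
]

def respond(message: str) -> str:
    for pattern, template in RULES:
        match = pattern.search(message)
        if match:
            return template.format(*match.groups())
    return "Please, go on."  # default when nothing matches

print(respond("I am worried about work"))
# -> "How long have you been worried about work?"
```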

But they’re poised for a resurgence in 2016, for two reasons. First, artificial intelligence and cloud computing have gotten better thanks to improvements in machine learning. Second, bots could be big money.

Tech titans have chatbots on social media

All the tech titans have released social bots on the web: Apple’s (AAPL) Siri, Facebook’s (FB) “bots on Messenger,” Google’s (GOOG) Allo, and Microsoft’s (MSFT) ill-fated Tay. They believe there’s a buck to be made here, and they’re scrambling to make sure they don’t get left out.

The July issue of the Communications of the ACM included an article, “The Rise of Social Bots,” which lays out social bots’ impact on online communities and society at large. The authors define a social bot as a computer algorithm that automatically produces content and interacts with humans on social media, trying to emulate and possibly alter their behavior.

Business Insider published this infographic about the social bot ecosystem.

Business Insider infographic

Chatbots can be deceptive

The ACM article argues that social bots populate techno-social systems; they are often benign, or even useful, but some are created to do harm by tampering with, manipulating, and deceiving social media users. The article offers several examples of how social bots can be a hindrance. The first involves the Twitter (TWTR) posts around the Boston Marathon bombing: the researchers’ analysis found that social bots were automatically retweeting false accusations and rumors, and they argue that forwarding those claims without verifying them granted the false information more influence.

The ACM article also discusses how social bots can artificially inflate support for political candidates. During the 2010 mid-term elections, some politicians used social bots to inject thousands of false tweets to smear their opponents. This type of activity puts the integrity of the democratic process at risk. These types of attacks are also called astroturfing or Twitter-bombs.

Anti-vaxxer chatbots

The article offers another example of social bots being used to influence policy in California. During the recent debate over a California law on vaccination requirements, there appeared to be widespread use of social bots by opponents of vaccination. This social bot interference puts an unknown number of people at risk of disease or death.

Greed is the most likely use of social bots. One example from the article is the April 2013 hack of the Associated Press’ Twitter account. The Syrian Electronic Army used the hacked account to post a false report that explosions at the White House had injured President Obama. The false story provoked an immediate $136 billion stock market drop as social bots amplified the rumor.

Chatbots manipulate social media reality

Research has shown that human emotions are contagious on social media, which means social bots can artificially manipulate social media users’ perception of reality without the users ever being aware of it. The article says the latest generation of Twitter social bots exhibits many “human-like” online behaviors that make it difficult to separate bots from humans. According to the authors, social bots can:

  • Search the web to fill in their profiles,
  • Post pre-collected content at defined times,
  • Engage in conversations with people, and
  • Infiltrate discussions and add topically correct information.

Some bots work to gain greater status by seeking out and following popular or influential users, or by taking other steps to garner attention. Other bots are identity thieves, adopting slight variants of user names to steal personal information, pictures, and links.

Strategies to thwart bad chatbots

The authors review several attempts to thwart these increasingly sophisticated bots.

1. Innocent-by-association – This approach measured the number of legitimate links vs. the number of social bot (Sybil) links a user has. The method proved flawed: researchers found that Facebook users are pretty indiscriminate when adding friends. The article says that 20% of legitimate Facebook users accept any friend request, and 60% accept friend requests with only one contact in common.
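
In code terms, the heuristic boils down to a link ratio. Here is a minimal Python sketch under my own simplified assumptions; the graph and accounts below are made up for illustration:

```python
# Sketch of the "innocent-by-association" heuristic: score an account by
# the fraction of its links that point to known-legitimate accounts.
from typing import Dict, Set

def legit_link_ratio(user: str, graph: Dict[str, Set[str]],
                     known_legit: Set[str]) -> float:
    """Fraction of a user's links that go to known-legitimate accounts."""
    links = graph.get(user, set())
    if not links:
        return 0.0
    return len(links & known_legit) / len(links)

graph = {"alice": {"bob", "carol", "sybil1", "sybil2"}}
known_legit = {"bob", "carol"}

print(f"alice: {legit_link_ratio('alice', graph, known_legit):.2f}")  # 0.50
# An ambiguous score, which is exactly the flaw the researchers found:
# indiscriminate friending dilutes the signal.
```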

2. Crowdsourcing – Another approach to stopping social bots is crowdsourcing, which would rely on users and experts reviewing an account and reaching a majority decision on whether the account in question is a bot or legit. The authors point out some issues with crowdsourcing:

  • It will not scale to large existing social networks like Facebook or Twitter.
  • “Experts” need to be paid to check accounts.
  • It exposes users’ personal information to unknown reviewers and “experts.”

3. Feature-based detection – The third method noted by the authors uses behavior-based analysis with machine learning to separate human-like behavior from bot-like behavior (a minimal sketch follows the list below). Some of the behaviors these applications examine include:

  • The number of retweets.
  • Age of account.
  • Username length.
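
Here is that minimal sketch: a feature-based classifier built with scikit-learn on just the three features above. The training data is fabricated for illustration; real detectors use many more features and far larger datasets:

```python
# Toy feature-based bot detector. Each row is an account described by
# [retweet_count, account_age_days, username_length]; labels are
# 1 = bot, 0 = human. All numbers here are invented for illustration.
from sklearn.ensemble import RandomForestClassifier

X_train = [
    [900, 12, 15],   # heavy retweeting, young account, long name: bot-like
    [850, 30, 14],
    [20, 2000, 8],   # modest activity, old account, short name: human-like
    [35, 1500, 7],
]
y_train = [1, 1, 0, 0]

clf = RandomForestClassifier(n_estimators=50, random_state=0)
clf.fit(X_train, y_train)

print(clf.predict([[700, 20, 13]]))  # likely [1]: looks bot-like
```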

4. Sybil until proven otherwise – The Chinese social network RenRen uses the fourth method noted by the authors: treat accounts as “Sybil until proven otherwise.” According to the article, this approach is better at detecting unknown attacks, like embedding text in graphics.

rb-

While people’s ability to critically assimilate information is beyond technology, the authors call for new ways to tell social bot-generated spam apart from real political discourse.

The researchers speculate there will not be a clean solution to the social bot problem. The more likely outcome is a bot arms race, like what we are seeing in the war on spam and other malware.

Related articles
  • Man vs. Machine: What do Chatbots Mean for Social Media? (blogs.adobe.com)


Data Center in Space

Cloud computing is old technology; an LA-based start-up wants to move your data beyond the cloud. Cloud Constellation wants to store your data in space. The firm is planning to build a satellite-based data center that will have room for petabytes of data and may start orbiting Earth as early as 2019, according to Computerworld.

CEO Scott Sobhani told the author that Cloud Constellation is looking upward to give companies and governments direct access to their data from anywhere in the world. Its data centers on satellites would let users bypass the Internet and the thousands of miles of fiber their bits now have to traverse to circle the globe. And instead of just transporting data, the company’s satellites would store it, too.

The article describes the pitch like this: data centers and cables on Earth are susceptible to hacking and to national regulations covering things like government access to information. They can also slow data down as it passes through switches and from one carrier to another, and all those carriers need to get paid.

Cloud Constellation’s system, called SpaceBelt, would be a one-stop shop for data storage and transport. Need to set up a new international office? No need to call a local carrier or data-center operator. Cloud Constellation plans to sell capacity on SpaceBelt to cloud providers that could offer such services.

Security is another selling point. Data centers on satellites would be safe from disasters like earthquakes, tornadoes, and tsunamis. Internet-based hacks wouldn’t directly threaten the SpaceBelt network. The system will use hardware-assisted encryption, and just to communicate with the satellites, an intruder would need an advanced Earth station that can’t simply be bought off the shelf, Mr. Sobhani told Computerworld.

Cloud Constellation’s secret sauce is technology it developed to cut the cost of all this from US$4 billion to about US$460 million, Sobhani said. The network would begin with eight or nine satellites and grow from there. Together, the linked satellites would form a computing cloud in space that could do things like transcode video as well as store bits. Each new generation of spacecraft would carry more modern data center gear.


The company plans to store petabytes of data across this network of satellites. Computerworld points out that the SpaceBelt hardware would have to be certified for use in space. Hardware in space is more prone to bombardment by cosmic particles that can cause errors. Most computer gear in space today is more expensive and less advanced than what’s on the ground, satellite analyst Tim Farrar of TMF Associates said.

Taneja Group storage analyst Mike Matchett told the author that the idea of petabytes in space is not as far-fetched as it may sound. A petabyte can already fit on a few shelves in a data center rack, and each generation of storage gear packs more data into the same amount of space. This is likely to improve even before the first satellites are built.
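
Matchett’s claim checks out with simple arithmetic. A rough Python sanity check, assuming circa-2016 10 TB drives and 60-drive storage shelves (my numbers, not the analyst’s):

```python
# How much rack space does a petabyte need? Assumes 10 TB drives and
# 60-drive 4U shelves; both figures are my own rough assumptions.
PETABYTE_TB = 1000
drive_tb = 10
drives_per_shelf = 60

drives_needed = -(-PETABYTE_TB // drive_tb)             # ceiling: 100 drives
shelves_needed = -(-drives_needed // drives_per_shelf)  # ceiling: 2 shelves

print(f"{drives_needed} drives across {shelves_needed} shelves")
```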

But if you do put your data in space, don’t expect it to float free of the laws of Earth. Under the United Nations Outer Space Treaty of 1967, the country where a satellite is registered retains jurisdiction over it once it’s in space, said Michael Listner, an attorney and founder of Space Law & Policy Solutions. If Cloud Constellation’s satellites are registered in the US, for example, the company will have to comply with subpoenas from the U.S. and other countries, he said.

And while the laws of physics are constant, those on Earth are unpredictable. For example, the US hasn’t passed any laws that directly address data storage in orbit, but in 1990 it extended patent law to space, said Frans von der Dunk, a professor of space law at the University of Nebraska. “Looking towards the future, that gap could always be filled.”

rb-

On the Bach Seat, we have covered different theories about data centers several times, including using manure, sewer gas, and used cars to power them, as well as proposed data centers underwater and at Kmart. This one, however, seems the most novel, considering the start-up costs to build and launch satellites.


Is Your Data Center Underwater?

Every time you like something on Facebook, a computer in a cloud data center somewhere in the world does something. That computer uses electricity to let the world know you like the sleepy puppy video or what your dinner looked like.

As you may have noticed if you have left your laptop on your lap too long, computers also produce heat. Facebook (FB), Twitter (TWTR), Instagram, and all the other time-wasters have millions of computers generating excess heat that needs to go somewhere. It is estimated that Facebook alone has hundreds of thousands of servers.

Keep servers cool

One way to keep servers cool is to keep them wet. As counter-intuitive as that seems, some companies use liquid immersion to cool their servers, according to the Register. This approach uses data centers featuring large ‘baths’ filled with a dielectric liquid into which racks of equipment are submerged.

Mineral oil has been used in immersion cooling before. Perhaps the best-known proponent of liquid immersion cooling is Green Revolution Cooling. Its CarnotJet system allows rack-mounted servers from any OEM to be dunked in special racked baths filled with a dielectric mineral oil blend called ElectroSafe (PDF), an electrical insulator that the company claims has 1,200 times more heat capacity by volume than air.

Green Revolution Cooling claims cooling energy reductions of up to 95%, server power savings of 10-25%, data center build-out cost reductions of up to 60% through simplified architecture, and improved server performance and reliability as a result of less exposure to dust (and moisture).
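
That 1,200x figure is plausible on paper, since volumetric heat capacity is just density times specific heat. A quick Python sanity check using typical handbook values for air and mineral oil (my assumptions, not ElectroSafe’s published spec):

```python
# Compare volumetric heat capacity (density * specific heat) of mineral
# oil vs. air. Property values are typical handbook figures.
air_density = 1.2   # kg/m^3 at room temperature
air_cp = 1005       # J/(kg*K)

oil_density = 850   # kg/m^3, typical mineral oil
oil_cp = 1900       # J/(kg*K), typical mineral oil

air_vol = air_density * air_cp  # ~1.2e3 J/(m^3*K)
oil_vol = oil_density * oil_cp  # ~1.6e6 J/(m^3*K)

print(f"oil/air ratio: {oil_vol / air_vol:,.0f}x")  # ~1,339x, near the claim
```
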
Microsoft has taken this technology to the next level and is experimenting with locating entire data centers underwater.

Microsoft underwater data center

Computerworld reports that Microsoft designed, built, and deployed its own sub-sea data center in the ocean in about a year. The Redmond, WA firm started working on the project in late 2014, after Microsoft employee Sean James, who had served on a U.S. Navy submarine, submitted a paper on the concept.

The eight-foot-diameter steel prototype vessel, named after the Halo character Leona Philpot, operated 30 feet underwater on the Pacific Ocean seafloor, about 1 kilometer off the California coast near San Luis Obispo, for 105 days from August to November 2015, according to Microsoft. Microsoft engineers remotely controlled the data center and even ran commercial data-processing projects from Microsoft’s Azure cloud computing service in the submerged vessel.

The sub-sea data center experiment, called Project Natick after a town in Massachusetts, is in the research stage, and Microsoft warns it is “still early days” to evaluate whether the concept could be adopted by the company and other cloud service providers. Microsoft says,

Project Natick reflects Microsoft’s ongoing quest for cloud data center solutions that offer rapid provisioning, lower costs, high responsiveness, and are more environmentally sustainable.

Microsoft believes undersea data centers can serve the 50% of people who live within 200 kilometers of the ocean. It says deep-water deployment offers “ready access to cooling, renewable power sources, and a controlled environment.” Moreover, a data center can be deployed from start to finish in 90 days.

Microsoft is weighing coupling the data center with a turbine or a tidal energy system to generate electricity, according to the New York Times.

Environmental impact

A new trial is expected to begin next year, possibly near Florida or in Northern Europe, Microsoft engineers told the NYT.

Some users questioned whether an undersea data center could have an environmental impact, including heating the water around the data center. But Microsoft claimed on its website that the project envisions data centers that would be totally recycled and would have zero emissions when co-located with offshore renewable energy sources. Microsoft told Computerworld:

No waste products, whether due to the power generation, computers, or human maintainers are emitted into the environment … During our deployment of the Leona Philpot vessel, sea life in the local vicinity quickly adapted to the presence of the vessel.

rb-

I have covered some other alternative ways to deal with data centers on Bach Seat, including HP’s plans to use cow manure to generate electricity and Microsoft’s plan to use sewer gas to power a data center in Wyoming.

Underwater data centers are an attractive idea, but there are challenges. One concern is that saltwater could corrode the structures. This issue could be resolved by locating the data centers in the freshwater Great Lakes. The Great Lakes basin is projected to reach a population of about 65 million by 2025.

The region includes eight U.S. states (Illinois, Indiana, Michigan, Minnesota, New York, Ohio, Pennsylvania, and Wisconsin) and the Canadian province of Ontario.


Ralph Bach has been in IT long enough to know better and has blogged from his Bach Seat about IT, careers, and anything else that catches his attention since 2005. You can follow him on LinkedIn, Facebook, and Twitter. Email the Bach Seat here.