Tag Archive for Michigan

Security Cam Concerns in Ann Arbor

Next time you are in Ann Arbor to get a bite to eat at Zingerman’s or attend a U of M football game at Michigan Stadium, someone may be watching you. NetworkWorld says Ann Arbor is one of the top U.S. cities with the most unsecured security cameras. In fact, Ann Arbor ranks seventh nationally.

The report’s author, security firm Protection 1, analyzed data from Insecam, a site that identifies open security cameras. Protection 1 estimates there are over 11,000 open security cameras on the Internet in the U.S. and identified the cities with the most cameras that can be viewed by anyone online. The top 10 cities with unsecured security cameras are:

  1. Walnut Creek, CA – 89.69 / 100,000 residents
  2. Richardson, TX – 72.74 / 100,000 residents
  3. Torrance, CA – 72.55 / 100,000 residents
  4. Newark, NJ – 38.07 / 100,000 residents
  5. Rancho Cucamonga, CA – 36.76 / 100,000 residents
  6. Corvallis, OR – 37.98 / 100,000 residents
  7. Ann Arbor, MI – 34.18 / 100,000 residents
  8. Orlando, FL – 34.05 / 100,000 residents
  9. Eau Claire, WI – 22.21 / 100,000 residents
  10. Albany, NY – 20.32 / 100,000 residents

Open security cameras connect to the Internet via Wi-Fi or a cable. They have no password protection or are using the manufacturer’s default password. Malicious people and governments can record or broadcast our lives from unprotected open security cameras. Open cameras are also vulnerable to attacks that can turn them into bots.

From a privacy perspective, the most worrisome finding is that 15% of the open cameras are in Americans’ homes. Anyone can watch these cameras if the default password is not changed to a unique password to lock down the camera.
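The fix the article urges, changing the factory default, is exactly what a defensive audit checks for. The sketch below tries a short list of well-known default credential pairs against a camera’s login check; the credential list and the simulated cameras are illustrative, not any real vendor’s defaults.

```python
# Defensive audit sketch: does this device still accept a factory-default login?
DEFAULT_CREDENTIALS = [
    ("admin", "admin"),
    ("admin", ""),
    ("admin", "12345"),
    ("root", "root"),
]

def find_default_login(try_login):
    """Return the first default (user, password) pair the device accepts,
    or None if every default is rejected -- i.e., the owner changed it."""
    for user, password in DEFAULT_CREDENTIALS:
        if try_login(user, password):
            return (user, password)
    return None

# Simulated camera still on its factory password:
unsecured_cam = lambda u, p: (u, p) == ("admin", "admin")
# Simulated camera whose owner set a unique password:
secured_cam = lambda u, p: (u, p) == ("admin", "x9!LongUnique")

print(find_default_login(unsecured_cam))  # ('admin', 'admin') -> exposed
print(find_default_login(secured_cam))    # None -> locked down
```

A `None` result is the outcome the article recommends: the camera rejects every well-known default because its owner set a unique password.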

Besides being spied on from the web, open cameras can be exploited by criminals. Cyber-criminals can force online cameras to attack other targets on the Internet as part of a DDoS attack.

A DDoS attack against a jewelry shop website led to the discovery of a CCTV-based botnet. A distributed denial-of-service (DDoS) attack is one in which a multitude of compromised systems attack a single target, causing a denial of service for users of the targeted system. TechTarget says the flood of incoming messages essentially forces the target system to shut down, denying service to legitimate users.

Help Net Security reports that Sucuri researchers discovered the jewelry site was being attacked by a CCTV botnet made up of 25,000+ cameras from around the globe. The website was first attacked by a layer 7 attack (HTTP Flood) at 35,000 HTTP requests per second and then, when those efforts were thwarted, with 50,000 HTTP requests per second.
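One way defenders spot an HTTP flood like this is to count requests in a sliding time window and flag when the rate crosses a threshold. Below is a minimal sketch of that idea; the 1,000 req/s threshold and the simulated traffic are illustrative (the attack described above ran at 35,000–50,000 req/s).

```python
from collections import deque

class FloodDetector:
    """Sliding-window request-rate check for layer-7 flood detection."""

    def __init__(self, threshold_rps=1000, window=1.0):
        self.threshold = threshold_rps
        self.window = window
        self.timestamps = deque()

    def request(self, now):
        """Record a request at time `now` (seconds); return True if the
        request rate in the trailing window exceeds the threshold."""
        self.timestamps.append(now)
        # Drop timestamps that have slid out of the window.
        while self.timestamps and self.timestamps[0] <= now - self.window:
            self.timestamps.popleft()
        return len(self.timestamps) > self.threshold

det = FloodDetector(threshold_rps=1000)
# 500 requests spread over one second: normal traffic, never flagged.
normal = any(det.request(t / 500) for t in range(500))
# 2,000 more requests packed into the next tenth of a second: flagged.
flood = any(det.request(1.0 + t / 20000) for t in range(2000))
print(normal, flood)  # False True
```

Real mitigation layers apply this kind of counter per source IP, which is why the 25,000-camera botnet was effective: each camera individually stays under any per-IP threshold.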

Sucuri researchers discovered that all the attacking IP addresses served a similar default page with the ‘DVR Components’ title. After digging some more, they found that all these devices are BusyBox-based. BusyBox is open-source software that aims to provide the smallest and simplest correct implementations of the standard Linux command-line tools.
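The fingerprinting step Sucuri describes, pulling the page title out of each attacking host’s default web page, can be sketched with Python’s standard-library HTML parser. The sample HTML strings here are illustrative.

```python
from html.parser import HTMLParser

class TitleParser(HTMLParser):
    """Collect the text inside the page's <title> element."""

    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.in_title = True

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

def looks_like_dvr(html):
    """Flag pages with the default 'DVR Components' title."""
    p = TitleParser()
    p.feed(html)
    return p.title.strip() == "DVR Components"

print(looks_like_dvr("<html><head><title>DVR Components</title></head></html>"))  # True
print(looks_like_dvr("<html><head><title>My Blog</title></head></html>"))         # False
```

Matching thousands of attacking IPs against one default-page signature is what let the researchers attribute the whole botnet to a single class of compromised DVR.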

The compromised CCTV cameras were located around the globe:

  • 24% originated from Taiwan,
  • 12% United States,
  • 9% Indonesia,
  • 8% Mexico,
  • and elsewhere.

rb-

Unless something is done, security flaws, misconfiguration, and ignorance about the dangers of connecting unsecured devices to the IoT will keep these botnets functioning well into the future.

To protect your website from botnets and DDoS attacks, you need to be able to block or absorb malicious traffic. Firms should talk to their hosting provider about DDoS attack protection. Can the provider route incoming traffic through distributed caching to help filter out malicious requests and reduce the strain on existing web servers? If not, find a reputable third-party service that can help filter out malicious traffic.
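One standard building block behind "block or absorb" is a token-bucket rate limiter: each client gets a sustained request rate plus a small burst allowance, and excess requests are dropped. The sketch below is a minimal illustration; the rates are made up.

```python
class TokenBucket:
    """Per-client token-bucket limiter: allow a sustained rate with bursts."""

    def __init__(self, rate, burst):
        self.rate = rate      # tokens added per second
        self.burst = burst    # bucket capacity (max burst size)
        self.tokens = burst
        self.last = 0.0

    def allow(self, now):
        """Refill based on elapsed time, then spend one token if available."""
        self.tokens = min(self.burst, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False

bucket = TokenBucket(rate=10, burst=5)   # 10 req/s sustained, bursts of 5
# 100 requests arriving within one tenth of a second: only the burst
# allowance gets through; the other ~95 are dropped.
allowed = sum(bucket.allow(t / 1000) for t in range(100))
print(allowed)  # 5
```

A filtering provider runs this kind of accounting per source IP at the network edge, so the origin web servers only ever see the trickle that passes.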

DDoS defense services require a paid subscription, but often cost less than scaling up your own server capacity to deal with a DDoS attack.

Arbor Networks is one firm that provides services and devices to defend against DDoS.

Google has launched Project Shield, which uses Google’s infrastructure to support free expression online by helping independent sites mitigate DDoS attack traffic.


Ralph Bach has been in IT long enough to know better and has blogged from his Bach Seat about IT, careers, and anything else that catches his attention since 2005. You can follow him on LinkedIn, Facebook, and Twitter. Email the Bach Seat here.

Michigan Leader in SPAM

In a surprise finding, the New Jersey-based anti-malware company Comodo’s Threat Research Labs found that Michigan is one of the leading sources of unsolicited e-mail on the Internet. Unsolicited bulk email, also known as “SPAM,” is usually considered junk e-mail. The Great Lakes State ranked third behind California and New York in spewing out the most SPAM.

The Comodo researchers examined all the emails Comodo filtered for customers in the second half of 2015, specifically looking at SPAM. In doing their research, they conducted an IP address analysis of the millions of pieces of SPAM that came into the Threat Research Labs from their customers.

Through this analysis, researchers were able to break down SPAM by state and find where it originated. IP addresses from California (24.37%) and New York (22.36%) sent nearly half of the spam Comodo filtered, while Utah (19.42%), Michigan (10.79%), and New Jersey (3.68%) IP addresses rounded out the top five states.
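The per-state breakdown boils down to mapping each spam message’s source IP to a location and tallying shares. The sketch below uses a toy prefix table (the prefixes and sample IPs are made up; real analysis would use a GeoIP database) to show the shape of the computation.

```python
from collections import Counter

# Illustrative prefixes only -- not real IP allocations.
PREFIX_TO_STATE = {
    "13.52.": "California",
    "67.20.": "New York",
    "155.98.": "Utah",
    "35.10.": "Michigan",
}

def state_shares(source_ips):
    """Map each source IP to a state and return percentage shares."""
    states = []
    for ip in source_ips:
        for prefix, state in PREFIX_TO_STATE.items():
            if ip.startswith(prefix):
                states.append(state)
                break
        else:
            states.append("Other")
    counts = Counter(states)
    total = len(source_ips)
    return {s: round(100 * n / total, 2) for s, n in counts.items()}

sample = ["13.52.0.1"] * 5 + ["67.20.9.9"] * 4 + ["35.10.1.2"]
print(state_shares(sample))  # {'California': 50.0, 'New York': 40.0, 'Michigan': 10.0}
```

Run over millions of filtered messages instead of ten samples, this is the same tally that produced Comodo’s state percentages above.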

Comodo State SPAM Map

Fatih Orhan, Director of Technology and lead at the Comodo Threat Research Labs, said:

California and New York were not really surprising in terms of the top two states because of population and technology innovation taking place in those geographies — but finding Utah and Michigan in the top five was somewhat shocking

rb-

I have followed the battle against SPAM since 2009. Here are some tips to help protect yourself from SPAM:

  • Keep your Junk E-mail Filter updated

Updates are available at Downloads on Office Online. Under Office Update, click Check for Updates.

  • Block images in HTML messages that spammers use as Web beacons

By default, Outlook is set to block automatic picture downloads. To verify your settings, on the Tools menu, click Options. Click the Security tab, and then click Change Automatic Download Settings. Verify that the Don’t download pictures or other content automatically in HTML e-mail check box is selected.

  • Watch out for checkboxes that are already selected

When you buy things online, companies sometimes add a check box (already selected!) to indicate that it is fine to sell or give your e-mail address to other businesses. Clear the check box so that your e-mail address won’t be shared.

  • DO NOT sign up for commercial mailing lists.
  • DO NOT reply to email or unsubscribe from a mailing list that you did not explicitly sign up for.
  • Configure your email client to send and receive emails in Plain Text or Rich Text Format.

For Microsoft Outlook, go to Tools > Options…, click the Mail Format tab, change your message format to Plain Text, and click OK.
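The same plain-text preference applies when sending mail programmatically. A message built with Python’s standard `email` library, as below, carries no HTML part at all, so it cannot embed the remote-image web beacons spammers use to confirm live addresses. The addresses are placeholders.

```python
from email.message import EmailMessage

# Build a plain-text-only message: no HTML alternative part is attached.
msg = EmailMessage()
msg["From"] = "me@example.com"
msg["To"] = "you@example.com"
msg["Subject"] = "Plain text only"
msg.set_content("This body is sent as text/plain -- no HTML, no images.")

print(msg.get_content_type())  # text/plain
```

Because `set_content` with a bare string produces a single `text/plain` part, a recipient’s client has nothing to render as HTML and no images to fetch.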

Lest we forget, this is the same Comodo that was responsible for releasing nine fraudulent certificates onto the Internet which, Sophos says, impacted the trusted root authority on all default Windows and OS X installations, as well as high-profile websites like:

  • mail.google.com
  • www.google.com
  • login.yahoo.com (3 certificates)
  • login.skype.com
  • addons.mozilla.org

Sophos states that this breach allowed an attacker to easily masquerade a malicious website as one of the above, with HTTPS authentication succeeding.
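The reason the breach was so serious is visible in the two checks a client performs: the certificate must be signed by a trusted CA, and its name must match the host being visited. A rogue certificate issued by a trusted CA for mail.google.com passes both checks on an attacker’s server. The sketch below models certificates as plain dicts purely for illustration; real validation involves full chain and signature verification.

```python
import fnmatch

TRUSTED_CAS = {"GoodRootCA", "Comodo"}

def cert_accepted(cert, hostname):
    """Toy client check: trusted issuer AND name match (with simple
    wildcard support via fnmatch)."""
    return cert["issuer"] in TRUSTED_CAS and fnmatch.fnmatch(
        hostname, cert["common_name"]
    )

legit = {"issuer": "GoodRootCA", "common_name": "mail.google.com"}
fraudulent = {"issuer": "Comodo", "common_name": "mail.google.com"}  # mis-issued
self_signed = {"issuer": "AttackerCA", "common_name": "mail.google.com"}

print(cert_accepted(legit, "mail.google.com"))       # True
print(cert_accepted(fraudulent, "mail.google.com"))  # True  <- the breach
print(cert_accepted(self_signed, "mail.google.com")) # False
```

The middle case is the whole problem: because the issuer sits in the trusted set, the client has no way to tell the mis-issued certificate from the real one, which is why mis-issued certificates must be revoked at the CA level.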

 


Is Your Data Center Underwater?

Every time you like something on Facebook, it causes a computer in a cloud data center somewhere in the world to do something. That computer uses electricity to let the world know you like the sleepy puppy video or what your dinner looked like.

As you may have noticed if you have left your laptop on your lap too long, computers also produce heat. Facebook (FB), Twitter (TWTR), Instagram, and all the other time-wasters have millions of computers generating excess heat that needs to go somewhere. It is estimated that Facebook alone has hundreds of thousands of servers.

Keep servers cool

One of the ways to keep servers cool is to keep them wet. As counter-intuitive as that seems, there are companies that use liquid immersion to cool their servers, according to the Register. This approach uses data centers featuring large ‘baths’ filled with a dielectric liquid into which racks of equipment are submerged.

Mineral oil has been used in immersion cooling before. Perhaps the best-known proponent of liquid immersion cooling is Green Revolution Cooling. Its CarnotJet system allows rack-mounted servers from any OEM to be dunked in special racked baths filled with a dielectric mineral oil blend called ElectroSafe (PDF), an electrical insulator the company claims has 1,200 times more heat capacity by volume than air.

Green Revolution Cooling claims cooling energy reductions of up to 95 percent, server power savings of 10-25%, data center build-out cost reductions of up to 60% through simplified architecture, and improved server performance and reliability as a result of less exposure to dust (and moisture).
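The "1,200 times more heat capacity by volume" claim is easy to sanity-check with textbook volumetric heat capacities. The figures below are round approximations for air and a typical mineral oil, not ElectroSafe’s published numbers.

```python
# Volumetric heat capacity: how much energy a cubic meter of the
# coolant absorbs per degree of temperature rise.
air_kj_per_m3_k = 1.2      # air at room temperature, ~1.2 kJ/(m^3*K)
oil_kj_per_m3_k = 1500.0   # typical mineral oil, ~1,500 kJ/(m^3*K)

ratio = oil_kj_per_m3_k / air_kj_per_m3_k
print(round(ratio))  # 1250 -- the same order as the claimed 1,200x
```

The ratio lands in the same ballpark as the marketing claim, which is why immersion needs far less pumped volume (and far less fan energy) than air cooling to carry away the same server heat.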
Microsoft has taken this technology to the next level and is experimenting with locating entire data centers underwater.

Microsoft underwater data center

Computerworld is reporting that Microsoft designed, built, and deployed its own sub-sea data center in the ocean in about a year. The Redmond, WA firm started working on the project in late 2014, after Microsoft employee Sean James, who served on a U.S. Navy submarine, submitted a paper on the concept.

The eight-foot diameter steel prototype vessel, named after the Halo character Leona Philpot, operated 30 feet underwater on the Pacific Ocean seafloor, about 1 kilometer off the California coast near San Luis Obispo for 105 days from August to November 2015, according to Microsoft. Microsoft engineers remotely controlled the data center and even ran commercial data-processing projects from Microsoft’s Azure cloud computing service in the submerged data center.

The sub-sea data center experiment, called Project Natick after a town in MA, is in the research stage, and Microsoft warns it is “still early days” to evaluate whether the concept could be adopted by the company and other cloud service providers. Microsoft says,

Project Natick reflects Microsoft’s ongoing quest for cloud data center solutions that offer rapid provisioning, lower costs, high responsiveness, and are more environmentally sustainable.

Microsoft believes that using undersea data centers can serve the 50% of people who live within 200 kilometers of the ocean. They say that deployment in deep-water offers “ready access to cooling, renewable power sources, and a controlled environment.” Moreover, a data center can be deployed from start to finish in 90 days.

Microsoft is weighing coupling the data center with a turbine or a tidal energy system to generate electricity, according to the New York Times.

Environmental impact

A new trial is expected to begin next year, possibly near Florida or in Northern Europe, Microsoft engineers told the NYT.

Some users questioned whether an undersea data center could have an environmental impact, including heating of the water around the data center. But Microsoft claimed on its website that the project envisages data centers that would be totally recycled and would have zero emissions when located alongside offshore renewable energy sources. MSFT told Computerworld:

No waste products, whether due to the power generation, computers, or human maintainers are emitted into the environment … During our deployment of the Leona Philpot vessel, sea life in the local vicinity quickly adapted to the presence of the vessel.

rb-

I have covered some other alternative ways to deal with data centers on Bach Seat, including HP’s plans to use cow manure to generate electricity and Microsoft’s plan to use sewer gas to power a data center in Wyoming.

Underwater data centers are an attractive idea, but there are challenges. One concern is that saltwater could corrode the structures. This issue could be addressed by locating the data centers in the freshwater Great Lakes. The Great Lakes basin is projected to reach a population of about 65 million by 2025.




Ford to Make Google Cars

The 2016 North American International Auto Show started today at Cobo Center in Detroit, so let’s talk about autonomous cars. Ford and Google are in talks to have the Dearborn, MI-based automaker build Google’s next-generation autonomous cars under contract, Automotive News has learned. A source with knowledge of the project says both parties have been negotiating the deal “for a long time.” An announcement, if finalized, could come as early as the International Consumer Electronics Show in Las Vegas.

Neither firm would confirm the reports for the record. Google (GOOG) officials did confirm that the company is talking to automakers. Ford Motor Company (F) official Alan Hall did say, “We work with a lot of tech companies all over the world. We keep these discussions private for obvious competitive reasons and we do not comment on speculation.”

Google loading up auto executives

To fan the rumors, two veteran Ford executives have recently joined Google. Former CEO Alan Mulally joined Google’s board of directors eight days after he retired from the automaker on July 1, 2014. Then in September, Google hired John Krafcik as CEO of the company’s Self-Driving Car Project. Mr. Krafcik, who most recently was president of TrueCar Inc., was CEO of Hyundai Motor America. He spent 14 years at Ford, including a stint as chief engineer during the development of the Ford Expedition SUV.

Ford is scheduled to hold a press conference on Jan. 5 in Las Vegas. Ford CEO Mark Fields, product development chief Raj Nair, research and advanced engineering vice president Ken Washington, and Don Butler, executive director of connected vehicles and services, are scheduled to attend.

Yahoo Autos reported on the negotiations, quoting three sources familiar with the deal. The sources said the deal would create a joint venture legally separate from Ford. The venture would shield Ford from potential liability. The agreement, if completed, also would be non-exclusive, meaning Google could negotiate a similar deal with another automaker.

Autonomous vehicle

CEO Fields recently gave Auto News an update on Ford’s Smart Mobility efforts. The initiative would bolster the company’s expertise in car-sharing and other new business models for transportation. He said, “It’s not about just going from an old business to a new business. It’s about going to a bigger business.”

Auto News theorizes that a Ford deal with Google would fit within the strategy laid out by CEO Fields. He commented during an interview:

It’s not only about what are the things that are going to be core to us but who are we going to partner with. I don’t think we can just be so arrogant to think that we’re going to do everything on our own and we’re going to do something better than maybe a company that does that 24/7. For us, partnerships are really important.

New mobility models beyond cars

During a visit to Ford’s Silicon Valley research facility in Palo Alto, CA, Mr. Fields signaled that Ford sees new mobility models as a way to grow its business. When asked why Ford is developing its own software for self-driving cars, rather than striking a deal to use best-in-class software from an outside vendor, Ford’s Fields joked that Silicon Valley practically invented the concept of “frenemies.” In a corporate context, that means companies are willing to simultaneously collaborate on projects and compete against one another. Ford’s R&D center is working on self-driving software, but Mr. Fields said, “that doesn’t mean we won’t work with others. I think that’s part of the beauty of being here.”

Such a partnership would mark another step toward the marketplace for Google. Bloomberg reported that Google is thinking of putting its technology into automated taxis as a rival to Uber and Lyft. Google may spin off the unit into a standalone business within its new Alphabet Inc. corporate structure in 2016.

Ties between Ford and Google

It isn’t clear whether Ford would design a purpose-built vehicle for Google or supply a standard production car fitted with the sensors and computers that the car needs to guide itself down the road.

Having Ford build Google’s test fleet would save the Silicon Valley tech giant years and billions in development costs. The Ford-built vehicles would use the automaker’s production-ready powertrain as well as safety and emissions components.

There are already ties between Ford and Google. Google’s first generation of 100 self-driving vehicles were assembled in Detroit by Roush Industries, a company closely aligned with Ford. The bubble-shaped cars, as Crain’s Detroit Business reported, used components from local Detroit-area suppliers.

Thilo Koslowski, lead automotive analyst at Gartner (IT) in Santa Clara, CA, said it makes sense that automakers would want to work with Google, which could help them catch up to rivals that are pursuing automated driving to differentiate their products.

And at Google, “the focus has shifted to looking for OEM partners to deploy the technology, rather than considering building their own vehicles,” the Gartner analyst said. “That makes sense. If Google is interested in bringing the benefit of the technology to consumers, then they need as many partners as possible.”

Ford and Google are said to have been in talks since at least 2012 on autonomous cars. The two companies also teamed up in 2011 on technology that would help vehicles learn customers’ driving habits and get them to destinations more efficiently.

VP Washington said recently that he expects fully autonomous vehicles to be ready within four years. Ford has secured approval to test its own autonomous cars in California, and it has been testing autonomous Fusion Hybrids at the University of Michigan’s 32-acre simulated city, Mcity.

rb-

Autonomous cars will increase the direct impact of the Internet of Things (IoT), with all of IoT’s inherent security and connectivity issues.

 



Happy Thanksgiving

Thanksgiving 2015

 

Happy Thanksgiving 2015

Detroit News, November 30, 1967: J. L. Hudson’s Thanksgiving Day Parade

 

 
