09 March 2011

Juniper Up Next with Cloud Switches


Juniper's data center announcement next week is expected to include switches based on new silicon that allows them to establish a flat fabric for cloud computing.

Sources say Juniper will unveil its Stratus line of data center and cloud switches on Feb. 23. Juniper announced the Stratus project two years ago as a flat, low-latency, lossless switching fabric for high-performance computing environments at enterprises and service providers.

The Stratus line will include 10-Gigabit Ethernet top-of-rack and core switches, sources say, as well as Juniper's entry into 40 Gigabit Ethernet. Inter-switch links between the top-of-rack and core switches will be 40G Ethernet, they say.
The top-of-rack switches could ship this month, while the core switches are expected to ship late this year.

Juniper declined to comment for this story.

With Stratus, Juniper is looking to essentially deconstruct three-tier data center switching architectures into two tiers, and eventually one. This is intended to increase performance and reduce operational time and cost by eliminating the need to deploy and manage additional products.

Sources say the Stratus switches will support a common control plane and a virtual data plane across Layer 2 switches. While physically dispersed, each switch port will function as if it is one virtual hop away from any other port.
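To see why flattening the fabric matters, here is a minimal Python sketch comparing server-to-server hop counts in a classic three-tier design with a flat fabric. The five-switch topology and its names are invented for illustration, not Juniper's actual design:

```python
# Toy comparison of hop counts: three-tier (top-of-rack -> aggregation
# -> core) versus a flat fabric. Topology is illustrative only.
from collections import deque

def hops(graph, src, dst):
    """Breadth-first search: number of links on the shortest path."""
    seen, queue = {src}, deque([(src, 0)])
    while queue:
        node, dist = queue.popleft()
        if node == dst:
            return dist
        for nbr in graph[node]:
            if nbr not in seen:
                seen.add(nbr)
                queue.append((nbr, dist + 1))

three_tier = {
    "tor1": ["agg1"], "tor2": ["agg2"],
    "agg1": ["tor1", "core"], "agg2": ["tor2", "core"],
    "core": ["agg1", "agg2"],
}
fabric = {
    "tor1": ["fabric"], "tor2": ["fabric"],
    "fabric": ["tor1", "tor2"],
}

print(hops(three_tier, "tor1", "tor2"))  # 4 links, through agg and core
print(hops(fabric, "tor1", "tor2"))      # 2 links: in and out, one virtual hop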

The switches will also feature port-level virtual Layer 2/3 services that can migrate with workloads, sources say. This is similar to the virtual machine service profile mapping feature Cisco and Brocade already support, or announced support for, on their respective Nexus and VDX data center switches.

"If you think about Juniper's Virtual Chassis, I believe it extends these capabilities across a fabric," one source said. "The added benefit is L3 services, especially for security."
Virtual Chassis allows several fixed-configuration Juniper EX switches to be combined into a single logical switch for increased scale and density, and to reduce a three-tier switching architecture into two tiers. Juniper has disclosed plans to extend this capability across more EX switches and MX routers.

Virtual Chassis, along with new Juniper techniques in the Stratus switches for replacing Spanning Tree in Ethernet data center networks, will be offered as an alternative to the IETF TRILL specification. TRILL, or Transparent Interconnection of Lots of Links, is designed to overcome the slow topology reconvergence times associated with Spanning Tree by enabling shortest-path multihop routing for large-scale Ethernet and Fibre-Channel-over-Ethernet data center networks.
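The difference is easy to see on a toy topology. In the Python sketch below, a hypothetical four-switch square offers two equal-cost paths that shortest-path multihop routing can use in parallel, while a spanning tree must block a link to stay loop-free:

```python
# Minimal sketch of why Spanning Tree idles links that TRILL-style
# shortest-path routing can use. The square topology is invented.
links = {("a", "b"), ("b", "d"), ("a", "c"), ("c", "d")}  # two equal paths a->d

def paths(src, dst, used=()):
    """Enumerate loop-free paths over the current link set."""
    if src == dst:
        yield used
        return
    for l in links:
        if src in l and l not in used:
            nxt = l[0] if l[1] == src else l[1]
            yield from paths(nxt, dst, used + (l,))

all_paths = list(paths("a", "d"))
shortest = min(len(p) for p in all_paths)
ecmp = [p for p in all_paths if len(p) == shortest]
print(len(ecmp))  # 2 equal-cost paths available to multipath routing

# A spanning tree must break the loop by blocking one link,
# leaving a single forwarding path between a and d.
links = links - {("c", "d")}
print(len(list(paths("a", "d"))))  # 1 path once STP blocks a link
```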

Juniper, however, is not a proponent of TRILL, while Cisco and Brocade have both said that their shortest-path multihop techniques are based on TRILL.

08 March 2011

What to Expect in Cloud-Based Communications in 2020


Computerworld -- BARCELONA -- John Donovan, AT&T's chief technology officer, is making what he calls "creepy" and "spooky" -- but ultimately useful -- predictions for wireless computing and communications in the cloud in 2020.

In one broad scenario, Donovan said wireless users could store all kinds of data about their lives in the cloud and authorize various algorithms and computing systems to analyze it, later using the results to help us communicate and to remind us of names, addresses, arcane facts and other more or less important tidbits.

That capability implies that the personalized mobile phones and tablets we carry around today won't be necessary, he said in an interview at Mobile World Congress.
In his outlook, someone could drive to dinner at a friend's house and use a wireless device there, perhaps the TV, to make a call or send a message after entering a password or a fingerprint scan. The device would then find all of the caller's personal information in the cloud, including the phone number or e-mail address of whoever was being called. Even the names of the other person's children would be accessible, for example.

"Answers to everything will be at our fingertips, and [the information will be] more mobile and more ubiquitous," Donovan said.

A consequence of this repository of information in the cloud is that people will become independent of devices like smartphones and tablets, Donovan said. "Software will converge and devices will disintegrate and we'll have fewer devices that belong to us anymore," he said. "I don't see the need to carry mobile devices to visit you at your house, and I'll borrow one you have and authenticate myself on it."

AT&T is already experimenting with the cloud concept in its labs, he said. One experiment relies on information that carriers have known for years about calling and data usage patterns, including the fact that a preponderance of people call home on Sunday nights.
That kind of pattern analysis will ultimately make communications and access to information more convenient, Donovan said. "If today I always answer calls from home at my work, then the phone will continue to ring there, but if I never answer your calls or e-mails, then I should never hear [or see] them," he said. "Time of day, day of week, location, business versus personal ... these things are not terribly complex [to analyze]."

AT&T's lab work has already used Donovan as a guinea pig. Engineers in recent weeks took all of Donovan's communications, including calls and e-mails, and uploaded logs from them, reading them not for words but for patterns. The lab analysis spit out a list of Donovan's top 30 best friends, ranked from 1 to 30. To his relief, "my wife was at the top," he said.
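In spirit, that ranking is just frequency analysis over communication logs. The short Python sketch below shows the idea on an invented log; AT&T's actual methods are not public, so the format and weighting here are assumptions:

```python
# Toy version of the pattern analysis Donovan describes: rank contacts
# by how often they appear in call and e-mail logs. The log entries and
# scoring are invented for illustration.
from collections import Counter

log = [
    ("call", "wife"), ("email", "boss"), ("call", "wife"),
    ("call", "colleague"), ("email", "wife"), ("call", "boss"),
]

scores = Counter(contact for _, contact in log)
top_friends = [contact for contact, _ in scores.most_common(30)]
print(top_friends)  # ['wife', 'boss', 'colleague'] -- wife at the top
```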

Venture Capitalists Flock To Cloud Computing Startup

Dell was born in a room at the University of Texas at Austin. Facebook originated in a Harvard dorm. Box.net, which traces its origin to a dorm room at the University of Southern California, may or may not become as large as Dell or Facebook, but if recent indications are anything to go by, people are willing to bet big money on it.
Box.net, a Silicon Valley startup that offers online file-storage and collaboration software, announced that it has obtained $48 million in fourth-round funding to finance its aggressive expansion plans. Meritech Capital Partners contributed $38 million in equity investment, with Emergence Capital Partners and Andreessen Horowitz, the fund operated by Internet pioneer Marc Andreessen, making up the balance. Hercules Technology Growth Capital Inc. has promised $10 million in debt financing. This brings the total raised to date to $77.5 million.
To put things into perspective, Meritech Capital had earlier invested in Facebook, NetSuite and Salesforce.com, and Andreessen is on the boards of Facebook, eBay and HP, among others. Clearly, these investors have an established record of backing winners.
"We're going to be investing quite heavily in our infrastructure, in our R & D, and in our product, to build the next generation of the way that businesses are going to manage and share their data in the cloud," said Aaron Levie, the 26-year old CEO of the company. He and his friend Dylan Smith founded the company in 2005 with $11,000 won in a poker game.
Since then, Box.net has grown by leaps and bounds among people who want to store data online. While the basic version of Box's service is free, the company charges customers $15 a month for extra features such as more storage or security enhancements. As of the end of 2010, it had more than 5 million users at 60,000 companies, including 73% of the Fortune 500.
The company intends to use the cash to double its 140-person staff over the next 18 months, expand overseas and accelerate development of its mobile applications. It will also look to build an enterprise sales force to convert more free users into paying customers.
The CEO has big dreams and expectations for his company. "This is a revolution that is democratizing enterprise software – the cloud has dramatically leveled the playing field for the delivery of services, and for the first time, technology adoption in the enterprise is being driven by the bottom-up," he wrote in a blog post. "... Box's beginnings were modest. ... We must invest aggressively to continue this success. We are no longer a small startup, but a 140-person-strong organization that must do everything in its power to bring better technology to the enterprise."
By Sourya Biswas

Read More...

How Cloud Computing Can Create Jobs

"Few companies that installed computers to reduce the employment of clerks have realized their expectations…. They now need more and more expensive clerks even though they call them 'operators' or 'programmers.'"
- Peter Drucker (1909-2005), American management guru and author.
As Peter Drucker wrote, new technologies do not reduce employment; they simply raise the level of skill required to be employed. Those who can change, prosper; those who can't, have to settle for low-paying jobs or join the ranks of the unemployed.
Take IT maintenance, for example. With cloud computing, businesses no longer have to keep staff for IT maintenance, since that is taken care of by the service provider. But wait: doesn't that mean there will be fewer jobs in the aftermath of a migration to cloud computing?
No, it's not as simple as that. Basic microeconomics suggests that any money thus saved would likely be invested back into the business, and the resulting expansion would create more jobs than are lost by axing the IT maintenance employees.
Moreover, cloud computing brings cost savings, and these savings, once cycled back into the business, create economic value far exceeding the savings themselves. As any economics student will explain, there's a multiplier at work here: if every recipient of a dollar goes on to spend 80 cents of it, each dollar saved ultimately generates around five dollars of economic activity (1 / (1 - 0.8) = 5).
According to a report released by storage solutions giant EMC in association with the UK think tank Centre for Economics and Business Research (CEBR), cloud computing could generate 2.4 million jobs over the next four years in the EMEA (Europe, Middle East and Africa) region, 300,000 of them in the UK alone.
This report, titled the 2011 Cloud Dividend, examined several key markets in Europe, including France, Italy, Spain, Germany and the United Kingdom. Adoption of cloud computing by just these five countries could generate more than $243 billion in annual revenue, with Germany getting the biggest share, the report estimates.
"One of the key drivers for economic recovery will be job creation," said Oliver Hogan, CEBR's managing economist. Migration to cloud computing is expected to produce 446,000 jobs each year in Europe, Middle East, and Africa until 2015.
"A critical element in businesses achieving the competitive advantage presented by cloud computing lies in the successful virtualization of mission-critical and revenue-generating applications," said Sandra Hamilton, EMC's vice president for Europe. "That will be key to realizing the full cloud dividend – and to deriving the powerful growth and productivity gains which, as the new report shows, can lead to meaningful job creation across the EMEA region."
The report also looked at how different industries will be affected by such a move. The distribution, retail and hotel sector will benefit the most from cloud computing, with more than $320.5 billion in earnings and 354,790 jobs generated, according to the study. BFSI (Banking, Financial Services and Insurance) is second with $253 billion, with government, education and health care some distance away with $155 billion. However, the public sector will be the biggest gainer in terms of job creation, with 801,000 positions expected to be created over the next five years.
Overall, the future seems quite bright, both for companies and their employees, if they make the move to cloud computing. Of course, employees may have to develop skills different from those in demand today. But one thing is clear: the properly trained employee won't lack for a job.
By Sourya Biswas

Read More...

07 March 2011

VCs Write $10m Check to Hybrid Cloud Storage Start-up




Five-year-old Egnyte has closed a $10 million B round led by Kleiner Perkins' Apple-partnered iFund for its hybrid cloud file server, a mix of cloud and on-premises NAS storage that larger accounts are finding attractive even though the start-up originally meant to focus on SMBs. Existing backers Floodgate and Polaris also kicked in.
The money should pave Egnyte's way into international markets and pay for engineering and domestic sales and marketing efforts.
The start-up says it quadrupled its customer base in 2010. It reportedly has five billion files stored in its cloud network and says more than two million files are uploaded and downloaded a day, or roughly 1,400 files a minute.
Now it's moving to the channel.
Egnyte offers organizations corporate-wide file storage, sharing and backup, combining the speed and security of a local file repository with the access, flexibility and scalability of the cloud.
Egnyte has already charged into the mobile space with apps for the iPad, iPad 2, iPhone and Android, and its service integrates directly into existing file storage infrastructures managed by existing directory services. It automatically synchronizes changes made to both local and cloud files.
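A minimal sketch of what such two-way synchronization involves, assuming a newest-copy-wins rule and ignoring conflicts, deletions and versioning (this is illustrative, not Egnyte's actual algorithm):

```python
# Naive two-way sync between a local and a cloud file listing, keyed on
# last-modified time. Invented data model; not Egnyte's implementation.
def sync(local: dict, cloud: dict) -> None:
    """local/cloud map filename -> (mtime, contents); newer copy wins."""
    for name in set(local) | set(cloud):
        l, c = local.get(name), cloud.get(name)
        if l is None or (c is not None and c[0] > l[0]):
            local[name] = c      # cloud copy is newer, or file is cloud-only
        elif c is None or l[0] > c[0]:
            cloud[name] = l      # local copy is newer, or file is local-only

local = {"plan.doc": (100, "v1"), "notes.txt": (205, "edited locally")}
cloud = {"plan.doc": (180, "v2 from another office")}
sync(local, cloud)
print(local["plan.doc"][1])   # "v2 from another office"
print(cloud["notes.txt"][1])  # "edited locally"
```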
Egnyte previously raised a $6 million A round that included Maples Investments.
Maureen O'Gara, the most read technology reporter for the past 20 years, is the Cloud Computing and Virtualization News Desk editor of SYS-CON Media. She is the publisher of the famous "Billygrams" and was the editor-in-chief of "Client/Server News" for more than a decade. One of the most respected technology reporters in the business, Maureen can be reached by email at maureen(at)sys-con.com or paperboy(at)g2news.com, and by phone at 516 759-7025. Twitter: @MaureenOGara

A History of Cloud Computing


"History is written by the victors."
- Winston Churchill (1874-1965), British Prime Minister during WWII.
History, especially if it deals with victories and defeats, is inherently flawed. No one knew this better than Winston Churchill, who was assured of his place in the historical tomes by his victory in WWII. As he put it, "History will be kind to me for I intend to write it". Therefore, are we taking a leap of faith by trying to encapsulate the history of cloud computing in mere words? Is it a wasted effort?

Not really. In our opinion, the best time to trace the growth of any paradigm, whether technology or culture or any other human endeavor, is when it has grown out of infancy but not attained maturity. For it is then that prejudices will not be able to color our opinions, whether positive or negative.

Since cloud computing is gaining acceptability by the day, it is no longer a beginner in the IT infrastructure space. At the same time, since many still resist the technology, citing concerns such as security and availability, it hasn't matured either. In other words, there's no time like the present for presenting its history.

The general idea behind the technology dates back to the 1960s, when John McCarthy wrote that "computation may someday be organized as a public utility." Grid computing, a concept that originated in the early 1990s as an idea for making computer power as easy to access as an electric power grid, also contributed to cloud computing. For a detailed look at the differences between utility, grid and cloud computing, see "Cloud Computing vs Utility Computing vs Grid Computing: Sorting The Differences."

The term "cloud computing" was most probably derived from the diagrams of clouds used to represent the Internet in textbooks. The concept was derived from telecommunications companies who made a radical shift from point-to-point data circuits to Virtual Private Network (VPN) services in the 1990s. By optimizing resource utilization through load balancing, they could get their work done more efficiently and inexpensively. The first time the term was used in its current context was in a 1997 lecture by Ramnath Chellappa where he defined it as a new "computing paradigm where the boundaries of computing will be determined by economic rationale rather than technical limits alone."

One of the first movers in cloud computing was Salesforce.com, which in 1999 introduced the concept of delivering enterprise applications via a simple website. Amazon was next on the bandwagon, launching Amazon Web Services in 2002. Then came Google Docs in 2006, which really brought cloud computing to the forefront of public consciousness (see: Cloud Computing and Google Docs). 2006 also saw the introduction of Amazon's Elastic Compute Cloud (EC2) as a commercial web service that allowed small companies and individuals to rent computers on which to run their own applications.
This was soon followed by an industry-wide collaboration in 2007 between Google, IBM and a number of universities across the United States. Next came Eucalyptus in 2008, the first open-source, AWS-API-compatible platform for deploying private clouds, followed by OpenNebula, the first open-source software for deploying private and hybrid clouds.
2009 saw Microsoft's entry into cloud computing with the launch of Windows Azure in November. Suddenly, major players were jumping into cloud computing left, right and center.

04 March 2011

Security for the Cloud: Data Integrity, Data Resilience and Data Security


Given the recent rise in popularity of cloud-based solutions, it is not surprising that many people are concerned about the security of their data. This means safeguarding the data against unauthorized access, data corruption or loss, and hardware failure. In reality, you will [...]

03 March 2011

Is Cloud Computing Losing Its Value Proposition?


Cloud Computing Journal
We all know that hardware prices always come down - but have you noticed any such trend in cloud computing? What happens when hardware prices keep falling and cloud services pricing remains steady? The cloud value proposition of lowering IT costs slowly disappears ... right? There is no doubt that the range of services available has significantly increased, but on the price front there has been very little movement. Yes, if you want to try the cloud for free you have more alternatives today - for example, Amazon has introduced a micro instance. But Google App Engine prices have remained more or less the same since launch, the same is true for Microsoft Azure, and Amazon is not much better: the roughly 15% price reduction it announced on a range of its services between mid- and late 2009 is probably the only instance since AWS launched.
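The arithmetic behind that worry is straightforward. The Python sketch below uses entirely hypothetical figures (server cost, three-year amortization, a 20% yearly hardware price decline, and a flat instance price) to show how steady cloud pricing loses ground to ever-cheaper hardware:

```python
# Back-of-envelope illustration of the article's point. All figures are
# invented, not actual 2011 AWS or server prices.
server_price = 6000.0    # hypothetical server cost, amortized over 3 years
cloud_per_year = 1500.0  # hypothetical flat price of a comparable instance
decline = 0.20           # assumed yearly hardware price drop

for year in range(1, 6):
    own_per_year = server_price / 3
    print(f"year {year}: own ${own_per_year:,.0f}/yr vs cloud ${cloud_per_year:,.0f}/yr")
    server_price *= 1 - decline  # hardware keeps getting cheaper; cloud doesn't
```

On these made-up numbers the cloud starts out cheaper, pulls roughly even in year two, and is undercut by owned hardware from year three on.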
Read More...

02 March 2011

MJC Becomes A Meraki Gold Partner


MJC (http://www.mjcgroup.co.uk/) have long been known as specialist suppliers of technology to the education industry and now offer the latest wireless technologies for ease of use, performance, and affordability. Meraki solutions offer top-of-the-line 802.11n access points with enterprise-class chipsets, hardware-accelerated encryption, and Priority Voice QoS, resulting in Ethernet speed without the wires. "We are excited to partner with MJC to bring Cloud Networking to the Education Sector," said Adam Davidson, Meraki's Director of Sales for EMEA.
Each access point is constantly monitored and optimised from the cloud, so the network automatically adapts to changing interference conditions. Multi-radio, multi-channel mesh routing and automatic mesh failover offer fault tolerance, and provide fast coverage in hard-to-wire areas.
MJC's Product Manager Nigel Ward explains how their new Meraki wireless solutions http://www.mjcgroup.co.uk/meraki-wireless are easy to deploy and manage - "When you plug in a Meraki access point, it automatically connects to the Cloud Controller over the web, downloads its configuration, and joins your network." He adds "As a result, Meraki deployments take hours - not days or weeks. Automatic cloud-based optimisation and monitoring, and intelligent, self-healing APs keep the network fast, reliable, and hassle free."
In addition, because this wireless solution is centrally managed over the web, you can configure, manage, and monitor APs anywhere in the world, right from your browser. Built-in multi-site support lets you painlessly manage branch locations over the web. Even if you have a large campus or lots of branches to cover, MJC's Meraki solutions have a highly scalable central management architecture that makes it as easy to configure 10,000 APs as it is to configure 10. Powerful tools provide deep visibility into all corners of your network, and real-time monitoring and alerts keep constant watch over your entire network.
Meraki networks are easy to use, yet fully featured for enterprises, with capabilities like RADIUS integration and intrusion detection. Even advanced functionality like policy firewalls and guest management are easy to configure with intuitive web-based management.
Meraki is the first application-aware wireless LAN. Each AP includes an integrated layer 7 packet inspection, classification, and control engine, enabling you to set QoS policies based on traffic type. Easily prioritise your mission-critical applications while setting limits on recreational traffic such as peer-to-peer file sharing, music, and video streaming.
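Conceptually, such a policy engine maps each classified flow to a priority and a rate cap. The Python sketch below illustrates the idea with invented traffic classes and numbers; it is not Meraki's API or configuration format:

```python
# Toy policy table in the spirit of application-aware QoS: traffic that a
# (pretend) layer 7 classifier has labeled gets a priority and a rate cap.
POLICIES = {
    "voip":      {"priority": 1, "rate_limit_kbps": None},  # mission critical
    "email":     {"priority": 2, "rate_limit_kbps": None},
    "video":     {"priority": 3, "rate_limit_kbps": 2000},  # recreational: cap it
    "peer2peer": {"priority": 4, "rate_limit_kbps": 500},
}

def shape(flow_class: str) -> dict:
    """Return the QoS policy for a classified flow, defaulting to best effort."""
    return POLICIES.get(flow_class, {"priority": 5, "rate_limit_kbps": 1000})

print(shape("voip"))       # highest priority, no cap
print(shape("peer2peer"))  # low priority, tight cap
```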
In addition, traditional controllers are often the most costly component of a wireless network - even before factoring in maintenance, support, and the value of your time. Meraki's hosted controller architecture eliminates the cost and complexity of on-site controllers, providing highly-available, scalable centralised management at a fraction of the cost.
To arrange a free wireless survey with a Data Communications Specialist at MJC please call 0121 258 4780 or email meraki@mjcgroup.co.uk .
Since 1983, MJC Group have been providing world-class technology. With IT Solutions, Business Solutions for document management and Data Communications, our divisions blend seamlessly to provide expertise that ensures maximum efficiency.

01 March 2011

Cloud Computing vs Utility Computing vs Grid Computing: Sorting The Differences


The Pacific Ocean is a body of water, but not all bodies of water are Pacific Oceans. This may be oversimplifying the situation, but you get the drift. Grid computing and utility computing, though sharing several attributes with cloud computing, are merely subsets of the latter. They may also be considered implementations of cloud computing, rather than different names for the same technology.
Before analyzing this further, it is necessary to define grid computing and utility computing. It is noteworthy that the nomenclature of both is derived from the electricity system. The first term comes from Ian Foster's and Carl Kesselman's seminal work, "The Grid: Blueprint for a New Computing Infrastructure" (2004), as a metaphor for making computer power as easy to access as an electric power grid. "Utility computing" comes from the practice of making IT infrastructure and resources available as a metered service, similar to a traditional public utility like electricity.

Grid computing can be defined as the use of computer resources from multiple administrative domains to reach a common goal. It can be considered a distributed system with non-interactive workloads involving a large number of files, yet one more loosely coupled, heterogeneous, and geographically dispersed than a cluster. In its simplest form, grid computing may be represented as a "super virtual computer" composed of many networked, loosely coupled computers acting together to perform humongous tasks.
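The scatter-gather pattern at the heart of that "super virtual computer" can be sketched in a few lines of Python, with a local process pool standing in for geographically dispersed grid nodes (a deliberate simplification; real grids add scheduling, data staging and fault handling):

```python
# The grid idea in miniature: split a big, non-interactive job into
# independent chunks and farm them out to workers.
from multiprocessing import Pool

def count_primes(bounds):
    """Count primes in [lo, hi) by trial division."""
    lo, hi = bounds
    return sum(all(n % d for d in range(2, int(n**0.5) + 1)) and n > 1
               for n in range(lo, hi))

if __name__ == "__main__":
    chunks = [(i, i + 25_000) for i in range(2, 100_002, 25_000)]
    with Pool() as grid:                 # each worker plays a grid node
        partials = grid.map(count_primes, chunks)
    print(sum(partials))                 # results gathered from all nodes
```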
Utility computing involves renting computing resources such as hardware, software and network bandwidth on an as-required, on-demand basis. In other words, what were earlier considered products are treated as services in utility computing. The idea was first propounded by American computer scientist John McCarthy of MIT as early as 1961, when he said, "If computers of the kind I have advocated become the computers of the future, then computing may someday be organized as a public utility just as the telephone system is a public utility… The computer utility could become the basis of a new and important industry."
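The utility model reduces to metering and per-unit billing, just as an electric utility bills per kilowatt-hour. A toy Python sketch, with invented resource names and rates:

```python
# Utility computing in a nutshell: resources are metered and billed per
# unit of use, like electricity. Rates and usage figures are invented.
RATES = {"cpu_hours": 0.10, "gb_stored": 0.15, "gb_transferred": 0.08}

def monthly_bill(usage: dict) -> float:
    """Sum metered usage times the per-unit rate for each resource."""
    return sum(RATES[resource] * amount for resource, amount in usage.items())

april = {"cpu_hours": 720, "gb_stored": 50, "gb_transferred": 120}
print(f"${monthly_bill(april):.2f}")  # $89.10: pay only for what was used
```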
Although both grid computing and utility computing were precursors to cloud computing, nowadays they can be considered implementations of the latter (though not always, as explained below). For cloud computing does everything grid computing and utility computing do, and much more. For example, cloud computing is not restricted to specific networks but is accessible through the biggest network of them all – the Internet. Also, virtualization of resources, and its consequent advantages of scalability and reliability, is much more pronounced in cloud computing.
Note that utility computing can be implemented without cloud computing. Consider a supercomputer that rents out processing time to multiple clients: this is utility computing, since users pay for the resources they use, but with only one location and no virtualization of resources it cannot be called cloud computing. Grid computing, on the other hand, may always be considered a weaker form of cloud computing, as there's always some virtualization involved. A grid can fail, however, if a single location more important than the others goes down, unlike cloud computing, where redundancy makes such situations manageable.
At the end of the day, we can say that grid computing is a weaker form of cloud computing, bereft of many of the benefits that the latter can provide. As for utility computing, it may be considered more of a business model than a specific technology. Although cloud computing supports utility computing, not all utility computing is based on the cloud.
By Sourya Biswas

Read More...