28 December 2010

Pareto Launched the Definitive Guide to Telework! Check it out here: http://info.paretonetworks.com/ebook.html

The new eBook “The Definitive Guide to Telework: A Pareto Networks Workbook” contains everything you need to know to make the most of your telework initiatives. It compiles the latest research and information on telework to help you understand:
  • Telework benefits – learn why organizations are considering telework roll-outs
  • Barriers to deployment – understand what’s hindering/slowing its adoption
  • Technology options – be sure you can pick the best solution for your requirements
  • Checklist considerations – get prepared to maximize the success of your telework initiatives


Join Us: http://bit.ly/joincloud

27 December 2010

Meraki Adds Client Location Services to Cloud-Controlled ...

Meraki, the cloud-based networking company, today announced the addition of Client Location Services to its cloud-based wireless LAN solution.

Knowing the physical location of mobile assets saves valuable time and money in many environments. In healthcare organizations, costly medical equipment must be tracked down to serve patients and for maintenance. In retail environments, handheld scanners can easily be misplaced, and in offices and classrooms, laptops and iPhones go missing all too frequently. In addition, detailed knowledge of client location can be invaluable to helpdesk personnel for troubleshooting client issues.

Meraki’s Client Location Services allows administrators to locate wireless clients quickly and easily, using existing Meraki wireless LAN infrastructure. With Client Location Services, administrators can view the location of wireless clients on a custom floor plan or map. Additionally, Client Location Services includes Google Maps integration for tracking clients in campus or outdoor deployments – administrators can view clients on map, satellite, or hybrid views via a simple toggle. Locations are viewed via Meraki’s secure browser-based dashboard, making Client Location Services ideal for distributed, multi-site, and remotely managed environments.

Client Location Services integrates seamlessly with Meraki’s Client Fingerprinting technology, which enables administrators to identify clients by device name, operating system, etc. An administrator can, for example, locate an individual client by searching for “John’s iPhone”, or search for “Windows 7” to see the locations of all laptops running Microsoft Windows 7.

“Client Location Services is a powerful new addition to Meraki’s wireless LAN,” says Slade James, Network Services Supervisor at Coronado Unified School District in California. “The fact that we woke up one morning and found that our network was upgraded with this new capability really hammered home the magic of cloud-based networking. We continue to find new uses for this capability, like tracking down clients that do not meet our standards for acceptable use or system configuration.”

Key Client Location Services benefits:

• Uses existing wireless LAN infrastructure – no new hardware or software to purchase or deploy, and no per-device licensing

• Built-in Google Maps and custom floor plan support

• No configuration, setup, or training required

• Scales to tens of thousands of clients per network

Client Location Services operates by intelligently triangulating client location using the signal strength of multiple access points that “see” the client. Location can be determined with as few as three access points, making Client Location Services equally suited to small branches and large campuses. Advanced triangulation techniques employ weighted averages and AP selection algorithms to ensure accuracy.
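The announcement doesn't disclose how Meraki's triangulation works internally. As a rough, hypothetical illustration of the weighted-average idea it mentions, here is a minimal weighted-centroid estimator in Python; the coordinates, RSSI values, and dBm-to-weight conversion are all illustrative assumptions, not Meraki's algorithm:

```python
# Hypothetical sketch of weighted-centroid location estimation.
# Meraki's actual algorithm is proprietary and unpublished; this only
# illustrates weighting AP positions by received signal strength.

def estimate_location(aps):
    """aps: list of (x, y, rssi_dbm) tuples for APs that "see" the client.
    Stronger (less negative) RSSI suggests the client is closer, so each
    AP's position is weighted by its linear received power."""
    if len(aps) < 3:
        raise ValueError("need at least three APs for a 2-D estimate")
    weights = [10 ** (rssi / 10.0) for _, _, rssi in aps]  # dBm -> mW
    total = sum(weights)
    x = sum(w * ax for w, (ax, _, _) in zip(weights, aps)) / total
    y = sum(w * ay for w, (_, ay, _) in zip(weights, aps)) / total
    return x, y

# Client heard loudest by the AP at (0, 0); the estimate lands near it.
print(estimate_location([(0, 0, -40), (10, 0, -60), (0, 10, -60)]))
```

A real system would refine this with the "AP selection algorithms" the article alludes to, e.g. discarding APs with marginal signal before averaging.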

Client Location Services provides location tracking for WiFi clients using an existing Meraki wireless LAN, with no added cost or complexity. As a certified technology partner of Real Time Location Services (RTLS) solution providers, Meraki wireless LANs also integrate with third-party RFID-based systems, enabling tracking of non-WiFi assets such as golf carts, hospital beds, and heavy machinery.

Client Location Services is available immediately to all Meraki Enterprise Cloud Controller customers.


Join Us: http://bit.ly/joincloud

23 December 2010

Panda Cloud Antivirus – Promising Cloud Security Tool

There has been exponential growth in the number of new viruses and malicious programs appearing every day across different IT platforms. Curbing this problem hasn’t been easy; it’s more like a game of cat and mouse. New threats and malware are created every day, and as a counter-measure, new updates and patches for different anti-virus software are released. Keeping a large system free of malware requires an antivirus that provides the required level of protection without compromising the computational speed of the cloud; Panda Cloud Antivirus is one piece of software with the capability to do that.

“Panda Cloud Antivirus is the best free antivirus software available… the next evolution of anti-malware technology”.

Editors’ Choice “Best Free Antivirus” November 2009 & June 2010, PC Magazine USA.

Over a year ago, the utility took the market by storm when it became the first free cloud-based antivirus. Panda Cloud’s strong competitors (AVG, Avast, Avira) couldn’t keep up when version 1.0 of Panda Cloud won Editors’ Choice for Best Free Antivirus. The application’s simple interface was one of its most striking features: unlike most antivirus software, with its endless display of complicated buttons, Panda Cloud keeps the controls to a minimum. Some of the new features in version 1.3:

• Malicious web & URL filtering
• Unified Recycle Bin and Quarantine
• Automatic and transparent upgrades to new product versions

Behavior-based detection is another feature, allowing instant identification of online threats. The software’s scanning speed and efficiency are also impressive: the utility identifies around 90% of malware, and scan times are on the low side. The progress bar is quite interactive and shows the threats being neutralized by the utility.

After the scan, a detailed report is generated listing the number of threats identified and the action taken. A step-by-step guide is also available that advises the user on which action to take.

A professional version of the application is also available, adding features such as a USB auto-vaccine and deep analysis of running processes, which justify its price. The auto-vaccine disables Windows autorun, restricting the execution of potentially unwanted software; if an unwanted file execution is detected, it can be countered within seconds (30 at most). The advanced settings panel included in this version beefs up security and malware detection by exposing more of the software’s features. The newer version of Panda Cloud requires a little more processing power than its predecessor.

Price tag aside, this antivirus has earned a good reputation. The free edition is no less impressive than the paid edition, and overall the software is highly recommended for all types of users, whether you are a beginner or a network administrator.


Join Us: http://bit.ly/joincloud

22 December 2010

OpSource Announces Enterprise-Class IaaS Managed Services for the Cloud

Infrastructure as a Service is set for rapid growth in 2011 as businesses and developers transition from running their own capital-intensive IT hardware and systems to “pay-as-you-go” cloud-based IT infrastructure. OpSource has recently announced its Enterprise-Class Managed Services for the Cloud, a full suite of the application and system management services businesses demand to migrate and scale mission-critical applications in the public cloud.

This offering enables organizations to experience the cost benefits of the public cloud while mitigating the security, control and integration issues that have inhibited the adoption of cloud services in the past. OpSource is one of the first public cloud providers to offer this combination of application and system management services in the cloud, which can be customized by choosing from two different levels:
• OpSource Tech Ops™: provides smaller organizations with the system monitoring, server administration, and OS support required to get up and running with cloud computing while minimizing operational risk.
• OpSource App Ops™: offers the application deployment, change management, data management, performance management, optimization management and compliance services that organizations need to scale cloud operations while minimizing operating expense and risk.

OpSource is making it easy and affordable for organizations to get up and running with cloud services by offering a pricing structure that scales with cloud computing usage. Managed Services for the Cloud are priced between $0.05 and $0.60 per CPU hour, based on volume and term commitments. OpSource’s “pay-as-you-go” pricing structure is a system that allows an organization to pay for enterprise-quality computing capability when they need it, paying for it only when it is being used. This flexible, frictionless sales model reduces the pricing barriers for many organizations, enabling companies to more seriously consider cloud-based managed services.
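As a back-of-envelope illustration of how per-CPU-hour pricing scales with usage, here is a quick calculation. The $0.05 and $0.60 rates are the range the article cites; the four-CPU workload is a made-up example:

```python
# Back-of-envelope cost at per-CPU-hour rates. The $0.05-$0.60 range
# comes from the article; the workload below is a made-up example.

def monthly_cost(cpu_hours, rate_per_cpu_hour):
    """Usage-based charge: pay only for CPU-hours actually consumed."""
    return cpu_hours * rate_per_cpu_hour

# Four CPUs running around the clock for a 30-day month:
hours = 4 * 24 * 30  # 2880 CPU-hours
print(monthly_cost(hours, 0.60))  # highest published rate
print(monthly_cost(hours, 0.05))  # lowest rate, with volume/term commitments
```

The twelvefold spread between the two rates shows why the volume and term commitments the article mentions matter so much to the final bill.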

Gartner estimates the market for cloud computing services will grow over the next few years, reaching $150 billion by 2013(1). Research VP for Gartner, Lydia Leong commented on the migration to cloud-based services: “Adoption of cloud infrastructure services is accelerating across businesses of all sizes. However, many organizations do not want to self-manage their infrastructure and cannot fully realize either the cost or agility benefits of the cloud without managed services. Some businesses lack the in-house expertise to manage their cloud servers, while others seek to free their IT organization of the need to do systems administration tasks that add no business value. A ‘menu’ of managed services on top of cloud infrastructure allows businesses to select the solutions that best fit their needs.”

In a move to further ease the transition to cloud services, OpSource also announced the availability of Hybrid Connect™, a secure Ethernet VLAN connection that links OpSource’s Managed Hosting and Cloud Hosting environments. With Hybrid Connect, organizations can leverage the flexibility of the public cloud and the performance of dedicated physical servers in the OpSource Managed Hosting environment.
“With Managed Services for the Cloud, we now have the ability to align our customers’ unique application requirements with the right combination of managed services and hosting resources,” said Treb Ryan, CEO of OpSource. “This enables us to provide companies with a customized solution that can help them to more confidently deliver their web-based services from the cloud without the complexity, operational issues and expense associated with supporting these applications on their own infrastructure. When you combine our managed services with our flexible pricing models and ability to scale service on demand, OpSource has removed any remaining barriers to enterprise adoption of cloud services today.”

For more information or to find out how to purchase OpSource Managed Services for the Cloud or Hybrid Connect, please visit www.opsource.net.


Join Us: http://bit.ly/joincloud

21 December 2010

Consumer Cloud Is Killing Enterprise Cloud

Richard Stallman's recent characterization of the impending Google Chrome OS as "careless computing" brings the careless use of the term "cloud computing" into sharp relief. Stallman says that the term "is devoid of substantive meaning," and thereby prone to uses that are less than un-evil. I agree with his sentiment but would argue that the opposite is true: there is too much substantive meaning in the term "cloud computing." It is an umbrella term that covers everything from free email and Facebook to software applications delivered from the sky to the very serious business of enterprise-grade resource consolidation and provisioning.

A Scrutable Term

Cloud computing is the first non-inscrutable IT term I can remember. It doesn't conjure up the sheer geekiness of SOA, Ajax, object-oriented programming, the Java/Javascript confusion, the PHP/Perl/Python triplets, RAID, SATA, LANs, or even the minor shoptalk inherent in terms like USB or Flash drive. "The Internet is the Cloud, and Cloud Computing comes from the Internet" is all you have to say. Your average 6-year-old or Senator can understand it. You might even be able to teach it to goldfish or Dogs Watching TV. Therein lies the problem. It's mildly annoying when the term is appropriated by every technology company on the planet to mean precisely what that company has been doing for years.

For example, I was talking to a software CEO the other night whose company really does offer Cloud Computing, and asked him about his recent appearance at a Cloud event in Asia. "I was really annoyed by many of my fellow presenters, who just use the term 'Cloud Computing' to describe whatever it is they do," he said. Well, we should be used to that now, whether we're being sold the old Lotus Notes as new cloud computing, or a big local data center as "cloud in a box," or dumbed-down, decades-old desktop apps as Office365.

The Enterprise Cloud

I appreciate a good joke as much as anyone, and it's enjoyable to watch the mad marketing scramble for the cloud stratosphere.

In the end, the pitches mean nothing if customers don't want it. Use of the term "Cloud" might get your foot back in the door, but the deal won't close until you jump through the same hoops you've been jumping through for years. But all this jollity refers to enterprise cloud computing: IT guys buying stuff, and deciding how much of it (if any) they want to virtualize, how much (if any) they want to locate off-premises, and whether the perceived benefit of minimal upfront cost trumps their concerns over long-term costs and security.

Here Comes Trouble

The real trouble starts when the term Cloud Computing refers to consumer computing. The Cloud takes a different, malignant new shape here. Riding the analogy, it's as if all those beautiful puffy cumulus clouds in the shapes of faces and lambs and rabbits suddenly turned into a large, threatening, anvil-headed thunderstorm. Weather people use the term cluster to describe a group of these storms; it seems appropriate to much of Cloud Computing discussion today.

What's the big problem, and why is this distinction between enterprise cloud and consumer cloud so important? In a word, "privacy." Privacy is not a concern with enterprise cloud computing per se. Security and data integrity are, to be sure, and privacy of company information is the highest priority, more important than any performance consideration. Yet these concerns revolve around whether or not a cloud computing solution--whether hosted onsite or offsite--will be leakproof and hackproof. There is no question of whether a third-party host will try to invade the company's data privacy.

On the other hand, privacy is the alpha and omega of consumer-cloud concerns. We've known for years that our web surfing has been tracked (and who knows to what degree actively monitored). We know that anything we send via Yahoo mail, gmail, etc. can and will be read by the Feds if they think--or want others to think--that we're up to no good.

Here Comes More Trouble

Meanwhile, half a billion people have guilelessly put their life stories onto Facebook, often allowing anyone to peer in. Does no one realize there are sexual predators, Feds, and other creeps who like nothing better than to snoop around? Is no one aware that a Michael Phelps bong-hit incident could easily lie in their future (for the small percentage of people who are inclined to such activities)? Many influential voices have already raised alarms about the Cloud. John C. Dvorak, for example, called users of cloud applications "suckers" back in 2008, then promised not to complain about the Cloud anymore. His most recent complaint was published just this November, in a column in which he noted, rightfully, that a very bad thing about the consumer cloud is that cloud companies can disappear--and with them, all the data they've been safekeeping. He used Drop.io as the example. He could have been writing about blog hosters that went kaput, thereby tolling the bell for brigades of earnest writers. Maybe someday he'll be writing about the demise of Facebook, or YouTube, or Twitter (oh wait, the Library of Congress is cataloging that one, thank Buddha). Meanwhile, shopping online has grown, as far as I can tell, to about $200 billion annually in the US alone.

This number is roughly the size of the entire economy of Malaysia, approaching that of Portugal. And this is Cloud Computing, folks. Concerns a decade ago about giving up our credit-card numbers online have been replaced by a faith that the banks and stores--whatever their faults--are not unduly invading our privacy. The fact that each of us leaves a digital footprint as obvious as size-13 bootprints in soft mud outside an unlocked window doesn't trouble those "who have done nothing wrong."

And Here's the Real Trouble

But here's where two things converge: less reliable companies and a more intrusive government. Facebook and Google don't pretend to be interested in your privacy. They're hardly alone, as evidenced by the increasing number of websites that encourage you to sign in via Facebook--thereby sharing your information with them.

Meanwhile, Google continues to map every block of the world; I hope it doesn't catch me peeing on a wall outside of a bar in, say, Dublin or Indianapolis in broad daylight some day, as that would be bad for my career. I consider this abject privacy invasion to be in violation of the Fourth Amendment; maybe some lawyer will take me up on this some day, as I'm too busy and too dim to go to law school. The more intrusive government aspect doesn't need much elaboration. Let's just say that the US government feels free to touch your junk, whether real or virtual, at any time of its choosing. The Patriot Act meant that my local librarian couldn't remind me of which books I read. I consider this to be a violation not only of the Fourth Amendment but of every principle upon which the US was founded. Sadly, I expect more of the same from the US government and others, as they strive to "protect our freedoms," just as I expect more of the same from companies interested only in "improving the customer experience."

This "Free" Thing Bugs Me

I think a large part of the problem lies in the idea that most things on the Internet should be free. This mindset seems to be killing newspapers. It also makes many people quite willing to trade in their privacy for free stuff. But nobody thinks enterprise cloud computing should be free. Less expensive, yes. Free, no. Nobody is worried about Cloud providers snooping on company data, although I'm guessing there will be emerging concerns about the government's ability to do so. There are many advocates (including me) who believe that enterprise cloud computing has the potential to revitalize the world's large, moribund economies and catalyze growth in developing nations. I do hope all future, thoughtful discussions make the distinction between Enterprise Cloud and Consumer Cloud. The difference is as big as the difference between that fluffy white kitten in the sky and that nasty storm that just tore the roof off of your house.

Like Load Balancing, WAN Optimization Is a Feature of Application Delivery

When WAN optimization was getting its legs under it as a niche in the broader networking industry it got a little boost from the fact that remote/branch office connectivity was the big focus of data centers and C-level execs in the enterprise. Latency and congested WAN links between corporate data centers and remote offices around the globe were the source of lost productivity. The obvious solution – get thee a fatter pipe – was at the time far too expensive a proposition and, in some cases, not a feasible option. We’d had bandwidth management and other asymmetric solutions in the past and while they worked well enough for web-based content the problem now was fat files and the transfer of “big data” across the WAN.

We needed something else.

The problem, it was posited, was simply that there was too much data to traverse the constrained network links tying organizations to remote offices and thus the answer, logically, was to do away with trying to juggle it all in some sort of priority order and simply make less data. A sound proposition, one that was nearly simultaneously gaining traction on the consumer side of the equation in the form of real-time web application data compression.

Here we are, many years later, and the proposition is still sound: if the problem is limited bandwidth in the face of applications and their ever growing data girth, then it behooves the infrastructure to reduce the size of that data as much as possible. This solution – whether implemented through traditional compression techniques or data deduplication or optimizing of transport and application protocols – is effective. It produces faster response times and thus the appearance, at least, of more responsive applications. As the specter of intercloud and cloud computing and the need to transport ginormous data sets (“big data”) in the form of data and virtual machine images continues to loom large on the horizon of most organizations it makes sense that folks would turn to solutions that by definition are focused on the reduction of data as a means to improve performance and success in transfer across increasingly constrained networks.
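The core idea (shrink the data before it crosses the constrained link) can be sketched in a few lines of Python. zlib here is merely a stand-in for whatever combination of compression, deduplication, and protocol optimization a real WAN optimizer applies, and the highly redundant payload is a contrived example:

```python
import zlib

# Illustrative only: the point is that reducing payload size cuts the
# bytes that must cross a constrained WAN link. zlib stands in for a
# real WAN optimizer's techniques; the repetitive payload is contrived
# to mimic the redundancy found in real application traffic.
payload = b"GET /reports/q4-summary HTTP/1.1\r\n" * 200
compressed = zlib.compress(payload, level=9)

print(len(payload), len(compressed))  # redundant data shrinks dramatically
```

Real deduplication engines go further than per-stream compression, recognizing chunks of data they have already sent to the peer appliance and transmitting only short references, which is why the paired-appliance deployment model discussed below matters.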

No argument there.

The argument begins when we start looking at the changes in connectivity between then and now. The “internet” is the primary connectivity between users and applications today, even when they’re working from a “remote office.” Cloud computing changes the equation from which the solution of WAN optimization was derived and renders it a less than optimal solution on its own because it does not fit the connectivity paradigm upon which cloud computing is based - one that is increasingly unmanageable on both ends of the pipe. Luckily, decreasing data size is just one of many other methods that can be used to improve application performance and should be used in conjunction with those other methods based on context.

Because of the way in which WAN optimization solutions work (in pairs) they are generally the last hop in the corporate network and the first hop into the remote network. This is a static implementation, one that leaves little flexibility. It also assumes the existence of a matching WAN optimization solution – whether hardware or software deployed – on the other end of the pipe. This is not a practical implementation for the most constrained and growing environments – mobile devices – because as an organization you have very little control over the endpoint (device) in the first place (consider the consumerization of IT) and absolutely no control over the network on which it operates.

A traditional WAN optimization solution may be able to help specific classes of mobile devices if the user has installed the appropriate “soft client” that allows the WAN optimization solution to do its data deduplication trick. That’s feasible for corporate users over whom you have control. But what about the millions of end-users out there on iPhones, BlackBerries, and tablets over whom you do not have control? They are just as important, and it is performance on which your organization/offering/solution will be judged by them. They’re an impatient lot, according to both Amazon and Google (and there are no studies to indicate that their conclusions are wrong), and mobile devices have garnered enough mindshare to be awarded the right to run even the most stolid of enterprise applications:

Senior IT executives plan to make CRM, ERP and proprietary apps available to mobile devices
Ellen Messmer, Network World

Roughly 75% of senior IT executives plan to make internal applications available to employees on a variety of smartphones and mobile devices, according to new research from McAfee's Trust Digital unit.

In particular, 57% of respondents said they intend to mobilize beyond e-mail and make CRM, ERP and proprietary in-house applications available to mobile devices. In addition, 45% are planning to support the iPhone and Android smartphones due to employee demand, even though many of these organizations already support BlackBerry devices.

Even if the end-user is not using a mobile device, it’s likely that their connection to the Internet exhibits very different characteristics than those experienced by corporate end-users. While download “speeds” have been increasing in the consumer market, we know there’s a difference between throughput and bandwidth, and that there is a relationship between the ability of the servers to serve and of consumers to consume. That relationship is often impeded by congestion, packet loss, endpoint resource constraints, and the shared nature of broadband networks. It is simply no longer the case that we can assume ownership of any kind over the endpoint, and certainly not over the network on which it resides.

And then you’ve got cloud. Cloud, oh cloud, wherefore art thou cloud? If you can deploy WAN optimization as a virtual network appliance then you have to be careful to choose a cloud that supports whatever virtualization platform the vendor currently supports. If you’ve already invested time and effort in a cloud provider and only later determined you need WAN optimization to improve the increased traffic between you and the provider (over the open, unmanaged Internet) you may be in for an unpleasant surprise.


Join Us: http://bit.ly/joincloud

20 December 2010

Wikileaks Attacks Prove the Cloud is Reliable

It's a strange world. When Amazon Web Services booted Wikileaks off its servers last week, many people (including me) said it raised significant questions about the rush into cloud services.

However, in a curious way, it's turned out very well for Amazon. The decentralized Operation Payback hacking group attempted to launch a Distributed Denial of Service (DDoS) attack against Amazon...and failed. In doing so, they proved that Amazon has the resources to cope, and thereby bolstered the reputation of Amazon Web Services (AWS), which provides Amazon.com's (AMZN) backbone.

In short, if you want an ultra-reliable cloud service that will resist a significant hack attack, then AWS is for you; Operation Payback has just proved it.

A posting on the now-suspended AnonOpsNet Twitter account read: "We cannot attack Amazon, currently. The previous schedule was to do so, but we don't have enough forces."

AWS had an uncharacteristic outage the other day, but that was apparently down to hardware failure. We'll have to take Amazon's word for that--although it seems that if practically any site goes offline for any reason, commentators are keen to find a connection with Wikileaks, however tenuous.

When referring to "forces," the AnonOpsNet group is referring to individuals who have downloaded and run the Low Orbit Ion Cannon (LOIC) software, which rapidly bombards a Website with requests. Combined with many other users, these requests push the site to its limits and sometimes take it offline. LOIC is ostensibly designed to stress test networks but can be misused easily. Modern Web servers and routing hardware are built to resist DDoS strikes, but there's little that can be done against massive attacks.

Anonymous saw a number of new recruits to its fold in the wake of major organizations withdrawing resources from Wikileaks, and the attacks taking place now are among the largest and most organized ever seen. Sites that have been targeted include Visa, MasterCard, PayPal, MoneyBookers.com, Tableau Software, Amazon.com and PostFinance, a Swiss financial institution.

Anonymous has also announced a wave of "fax bombing," whereby the fax machines of undesirable corporations are sent hundreds of messages. The goal is partly to use up the machine's toner but mostly to cause disruption.

These are very interesting times for the Internet, and the rulebook we once all thought sacrosanct is being rewritten subtly. As strange as it sounds, cloud computing might just be proving itself at precisely the moment when it needs to.

If you intend to launch a Website, then hosting your files with a service like AWS might make a lot of sense, especially if hack attacks might be a concern (for example, if you target goods and services at certain sectors of the Internet community). Indeed, at the moment, the cloud may well be the best solution for avoiding DDoS attacks.

Keir Thomas has been writing about computing since the last century, and more recently has written several best-selling books. You can learn more about him at http://keirthomas.com and his Twitter feed is @keirthomas.


Join Us: http://bit.ly/joincloud

Oracle Takes on Microsoft, Google with Cloud Office

Oracle (ORCL) on Wednesday announced the availability of Cloud Office 1.0, a Web-based productivity suite that is set to give online applications from Microsoft (MSFT) and Google (GOOG) a fresh dose of competition.

Cloud Office is integrated with the on-premises Oracle Open Office, of which version 3.3 was also announced Wednesday.

Like Open Office, Cloud Office is based on ODF (Open Document Format). It provides a set of spreadsheet, text and presentation applications and is compatible with Microsoft Office, according to Oracle.

Customers can use Cloud Office to collaborate on documents over the Web as well as access them on mobile devices, Oracle said. Information on supported mobile devices wasn't immediately available.

Cloud Office stands ready for "enterprise and carrier-grade deployment" thanks to its "Web-scale" SaaS (software as a service) architecture, but is available in on-premises form as well, Oracle said.

It will be sold to business users as Cloud Office Professional Edition. Telcos and ISPs can offer their customers Cloud Office in Home, Standard and Professional Editions, according to an Oracle presentation.

A perpetual license for Professional Edition costs US$90, plus 22 percent annual maintenance fees. Cloud Office Standard Edition will cost $40 per user when sold through a telco or ISP, with support acquired through the provider. Home Edition is not currently available, according to an Oracle spokeswoman.

Cloud Office is also available via subscription at $40 per user per year for Professional Edition and $20 per user per year for Standard Edition, the spokeswoman said.

Cloud Office's cost could be key to its success against incumbent offerings like Google Apps for Business, which costs US$50 per user per year.

Meanwhile, new features in Open Office 3.3 include plug-ins for Oracle's BI (business intelligence), E-Business Suite ERP (enterprise resource planning) software, and Microsoft SharePoint.

While Oracle has a long way to go in catching up to competing office suites, it is hoping to close the gap by positioning its products as more flexible and open alternatives.

Open Office 3.3 Standard Edition costs US$49.95 per user and is meant for companies with one to 99 employees. The Enterprise Edition, which includes many more tools, connectors and supported platforms, costs $90 per user with a minimum of 100 users, although volume pricing is available.

But interoperability with Office comes at an additional price. Earlier this year, Oracle imposed a $90 per user fee on an ODF plug-in that enables the sharing of files between Open Office and Microsoft Office. The plug-in had been available at no charge under Sun's ownership.

Still, Oracle maintains customers can reduce their office productivity license costs by up to a factor of five by using Open Office.

Oracle has faced scrutiny this year from backers of OpenOffice.org, the open-source version of Open Office, with some fearing the company would stop supporting the effort.

A number of OpenOffice.org developers recently formed an offshoot project, LibreOffice. Oracle later publicly reaffirmed its commitment to OpenOffice.org.

Oracle gave no indications in Wednesday's announcement that Cloud Office will also be released as open source.


Join Us: http://bit.ly/joincloud

17 December 2010

Oracle butts into online collaboration space with Cloud Office

Move over Google and Microsoft: Oracle wants to get in on the cloud productivity scene too. The company has announced Oracle Cloud Office, which will allow users to create and edit documents collaboratively in the browser without having to rely on desktop software.

The 1.0 version of Oracle Cloud Office "enables web 2.0-style collaboration," says Oracle, and can be viewed (but apparently not edited) on smartphones and other mobile devices. The online service is compatible with both Open Office documents and Microsoft's Office format, though it's built around the open ODF format. Unlike services like Google Docs, though, it doesn't look like Cloud Office will be free to consumers—it's being offered to businesses and other service providers (like ISPs) as a way to create online presentations, spreadsheets, and other documents on the Web.

Speaking of Open Office, the company also announced Open Office 3.3, which has built-in integration with Cloud Office. The updated suite is largely geared towards enterprise users, with ways to connect to Oracle Business Intelligence, Oracle E-Business Suite, and Microsoft SharePoint.

Although there are already a couple major players in the cloud productivity space, offering such a service for Open Office users is a good idea for Oracle. The company already tries to lure away Microsoft customers through the desktop version of its software, and adding collaboration tools will help Oracle keep those customers.

After Security, Network Bandwidth is the Next Cloud Bottleneck

Security concerns (real and imagined) have long dominated much of the cloud conversation and caused many companies to hesitate before getting started in the cloud. Slowly, the security issues are being addressed--through the adoption of corporate policies for cloud usage, maturing cloud provider offerings, and by technologies such as CloudSwitch, which isolate and encrypt all cloud resources to meet the requirements of the CSO. But while the focus has been on cloud security, another potential bottleneck is on the horizon as companies start using the cloud in more substantial ways.

In our discussions with IT executives and their teams, we’ve been hearing about a new concern: the ability of corporate networks to handle cloud traffic. Network performance is a lurking issue that hasn’t yet received the attention it deserves. That’s understandable, since bandwidth is rarely a problem for companies exploring the cloud in a small way, where they may deploy a few experimental VMs in order to understand the process. But as they start expanding their cloud footprint and running production-oriented applications, data movement takes on a completely different scale. As enterprises start to move real workloads out to the cloud (or to straddle internal and external clouds), look for network performance to become top of mind.

IT professionals and developers often assume they have huge network capacity, and it’s probably ample for their current Internet usage or the small cloud projects they may have tried so far. But what will happen, for example, when you have dozens of developers all trying to use cloud resources? Or if you put high-transaction processes in the cloud that need to “talk back” to your data center? What if you are trying to move a lot of video or graphics between your business users and the cloud? Network usage is about to get much more demanding, and the traffic will need to flow without bottlenecks (or saturating the network) for an organization’s cloud strategy to work.
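A back-of-the-envelope calculation shows why this matters. The data size, link speed, and usable fraction below are illustrative assumptions, not figures from the article:

```python
# Rough estimate of how long it takes to push a workload's data to the
# cloud over a shared link. All figures here are illustrative.

def transfer_hours(gigabytes, link_mbps, utilization=0.5):
    """Hours to move `gigabytes` over a link, using a given fraction of it."""
    bits = gigabytes * 8e9
    usable_bps = link_mbps * 1e6 * utilization
    return bits / usable_bps / 3600

# A 500 GB dataset over a 100 Mbps line, where cloud traffic can only
# consume half the link without starving everything else:
print(f"{transfer_hours(500, 100):.1f} hours")  # prints about 22.2 hours
```

At nearly a full day per 500 GB, a handful of teams moving workloads concurrently can saturate a corporate uplink that once seemed ample.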

15 December 2010

Microsoft Gets Big Cloudy Federal Deal

Microsoft has brought home a bigger deal with the federal government than Google did last week when it hung 15,000 Gmail and Google Apps seats at the General Services Administration (GSA) on the wall.
Microsoft's got a contract for 120,000 e-mail, instant messaging, web conferencing and document-sharing seats with the Department of Agriculture.
It's supposed to be the largest government adoption of cloud software yet, according to Microsoft.
Google complained that it wasn't allowed to bid. It's already suing the Department of the Interior over an 88,000-seat messaging contract that went to Microsoft because the bid requirements specified Microsoft software.
Half the USDA already uses Exchange, which made its decision easier. The rest of the agency runs a jumble of other systems.
The deal should be worth about $27 million over three years, with the agency paying about $8 a head per mailbox. The USDA should save about $6 million a year.


Join Us: http://bit.ly/joincloud

13 December 2010

Cloud computing 'could boost EU'

Widespread adoption of cloud computing could give the top five EU economies a 763bn-euro (£645bn; $1tn) boost over five years, a report has said.

The CEBR said it could also create 2.4m jobs. The technology gives software and computing power on demand over the net.
But experts warn that cloud computing can be very disruptive to business, and companies could end up "disillusioned".
"Nothing kills a new technology better than a poor user experience," said Damian Saunders of Citrix.


Are you more productive at home?

Jason Fried gave an interesting talk at TEDx Midwest, discussing how real work gets done at home. How so? It’s all about distractions. At home, people face only the distractions they choose to give in to: TV, checking email, cooking elaborate lunches and so on. At work, employees get distracted by M&Ms – managers and meetings. For example, a manager can come by and completely disrupt a thought process. To really get something done, workers need a few hours of uninterrupted thought (about three). Without them, employees are limited to a series of working moments between one interruption, scheduled or unscheduled, and the next.

But is home the perfect solution? According to The New York Times, in the article “Laptopistan”:

“At home, the slightest change in light is enough of an excuse to get up, walk around, clip my nails or head into the kitchen. Though home offices seem like the perfect work environment, their unrestricted silence, uninterrupted solitude and creature comforts breed distraction.”

What do you think? Where’s the best place to get work done?

At FreshBooks, employees need to be present in the office so we can collaborate and create the best work we can as a team. We’re expected to be in the office most days, as that’s key to FreshBooks’ culture and success. However, when someone needs to get something done without interruption, they may work from home to avoid the distractions of the office.

Bonus: If you’re looking for strategies on how to be more productive at home, watch Workday Nirvana by FreshBooks CEO Mike McDerment, who used to be a freelancer working from home. For Mike, it comes down to routine, discipline, and self-respect. The video was originally shown at International Freelancers Day.


Join Us: http://bit.ly/joincloud

10 December 2010

Teleworkers are happier than traditional office workers! http://bit.ly/hDm9jv

Employees who spend most of their time telecommuting are more satisfied with their jobs—mainly because they avoid workplace distractions and stressors like office politics, interruptions, constant meetings and information overload.

That’s according to new research by Kathryn Fonner, University of Wisconsin-Milwaukee professor of communication, and Michael Roloff, a professor of communication studies at Northwestern University.

The study compared the advantages and disadvantages of telecommuting with those of traditional office work. Participants who telework at least three days a week reported decreased work-life conflict.

“Our findings emphasize the advantages of restricted face-to-face interaction, and also highlight the need for organizations to identify and address the problematic and unsatisfying issues inherent in collocated work environments,” Fonner said. “With lower stress and fewer distractions, employees can prevent work from seeping into their personal lives.”

In addition to implementing telework arrangements for employees, organizations may consider several other strategies to boost job satisfaction for both office-based and distance workers, Fonner adds, including:
  • limiting the number of meetings and mass emails
  • streamlining office communication by creating a repository of information that can be accessed at any time
  • designating certain time and space where office-based employees can work uninterrupted
  • encouraging employees to disconnect from workplace communication after hours.
The findings are published in the Journal of Applied Communication Research.


Join Us: http://bit.ly/joincloud

It's Called Cloud Computing, Not Cheap Computing

The debate between private and public cloud is ridiculous and we shouldn’t even be having it in the first place.

There’s a growing sector of the “cloud” market that is mobilizing to “discredit” private cloud. That ulterior motives exist behind this effort is certain (as followers of the movement would similarly claim regarding those who continue to support the private cloud), and these will certainly vary based on who may be leading the charge at any given moment.

Reality is, however, that enterprises are going to build “cloud-like” architectural models whether the movement succeeds or not. While folks like Phil Wainewright can patiently point out that public clouds are less expensive and have a better TCO than any so-called private cloud implementation, he and others miss that it isn’t necessarily about raw dollars. It’s about a relationship between costs and benefits and risks, and analysis of the cost-risk-benefit relationship cannot be performed in a generalized, abstract manner. Such business analysis requires careful consideration of, well, the business and its needs – and that can’t be extrapolated and turned into a generalized formula without a lot of fine print, disclaimers, and caveats.

But let’s assume for a moment that, whatever the real cost-benefit analysis of private versus public cloud might be for a given organization, public cloud is less expensive.

So what?
If price were the only factor in IT acquisitions then a whole lot of us would be out of a job. Face it, just because a cheaper alternative to “leading brand X” exists does not mean that organizations buy into it (and vice versa). Organizations have requirements for functionality, support, and compliance with government and industry regulations and standards; they have an architecture into which such solutions must fit, integrate, interoperate and collaborate; they have both operational and business needs that must be balanced against costs.

Did you buy a Yugo instead of that BMW? No? Why not? The Yugo was certainly cheaper, after all, and that’s what counts, right?

IT organizations are no different. Do they want to lower their costs? Heck yeah. Do they want to do it at the expense of their business and operational requirements? Heck no. IT acquisition is always a balancing act and while there’s certainly an upper bounds for pricing it isn’t necessarily the deciding factor nor is it always a deal breaker.

It’s about the value of the solution for the cost. In some infrastructure it’s about performance and port density; in others it’s about features and flexibility; in still others it’s how well a solution is supported by other application infrastructure. The value of public cloud right now is in cheap compute and storage resources. For some organizations that’s enough; for others, it’s barely breaking the surface. The value of cloud is in its ability to orchestrate – to automatically manage resources according to business and operational needs. Those needs are unique to each organization, and thus the cost-benefit-risk analysis of public versus private cloud must also be unique. Unilaterally declaring either public or private a “better value” is ludicrous unless you’ve factored in all the variables in the equation.

http://cloudcomputing.sys-con.com/node/1637502

Join Us: http://bit.ly/joincloud

09 December 2010

Cloud Computing Acquisition: Cisco To Acquire LineSider

"Cloud computing represents a significant opportunity for Cisco customers to create more effective business models and increase the operating efficiency of the network," said Jesper Andersen, senior vice president of Cisco's Network Management Technology Group (NMTG), as Cisco this week announced its intent to acquire privately held LineSider Technologies, Inc., a leading provider of network management software that "helps customers build the network services necessary to securely create and deploy cloud computing infrastructure."
"With the acquisition of LineSider," Andersen continued, "Cisco will gain a key component in helping customers make this shift."
Based in Danvers, MA, LineSider will be bringing to Cisco advanced network management software that integrates both physical and virtual network services with a policy-based approach and makes networks more flexible and responsive to change. This - said Cisco in an announcement - will enhance its ability to rapidly provision network services.


Join Us: http://bit.ly/joincloud

Google and Its Cloudware Win Largest Federal Site Yet

GSA expects the deal, plucked out from under Microsoft, to cut its costs in half over the next five years

Google has won the General Services Administration (GSA) over to Gmail and Google Apps for Government.
The GSA, which is sorta like the federal government's quartermaster corps, said it's the first federal agency to move e-mail to a cloud-based system agency-wide.
It expects the deal, plucked out from under Microsoft, to cut its costs in half over the next five years and save it $15 million.
It will bring Google another 15,000 seats.
The agency said the widgetry better suits its mobile work force and is "in step with the administration's ‘cloud first' strategy."
The order is worth $6.7 million to Unisys, which partnered with Google, Tempus Nova and Acumen Solutions. Unisys will provide the services and implement Google's software. It will tear out IBM's Lotus Notes and Domino software.


Join Us: http://bit.ly/joincloud

F5 Gets More Cloud-Friendly

F5 is making file virtualization more cloud-friendly with the introduction of software that translates storage protocols, making it possible to store files in public or private cloud networks using a range of technologies. ARX Cloud Extender software runs on servers that sit between F5's ARX file virtualization appliances and storage networks that may use different protocols than those used by the devices the files are being sent from, the company says.

So if CIFS files are being stored in an Iron Mountain (IRM) Virtual File Store service cloud, the ARX Cloud Extender will make the protocol translation. The software can handle any NFS or CIFS implementation as well as Iron Mountain VFS and NetApp (NTAP) StorageGrid.

The software is expected to be available by the end of the year. F5 isn’t releasing pricing.
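The protocol-translation idea described above can be sketched as an adapter: one file interface in front, pluggable storage backends behind. This is a toy illustration of the general pattern only; the class and method names are hypothetical and do not reflect F5's actual ARX Cloud Extender internals.

```python
# Minimal sketch of a protocol-translating file gateway: clients see one
# interface, and a configured backend handles the target protocol.

from abc import ABC, abstractmethod

class StorageBackend(ABC):
    """One concrete subclass per target protocol (CIFS share, cloud store...)."""
    @abstractmethod
    def put(self, path: str, data: bytes) -> None: ...
    @abstractmethod
    def get(self, path: str) -> bytes: ...

class InMemoryCloudStore(StorageBackend):
    """Stand-in for a cloud object store; a real backend would speak
    the provider's protocol here."""
    def __init__(self):
        self._objects = {}
    def put(self, path, data):
        self._objects[path] = data
    def get(self, path):
        return self._objects[path]

class FileGateway:
    """Presents one file interface; translates calls to the backend."""
    def __init__(self, backend: StorageBackend):
        self.backend = backend
    def write_file(self, path, data):
        self.backend.put(path, data)
    def read_file(self, path):
        return self.backend.get(path)

gw = FileGateway(InMemoryCloudStore())
gw.write_file("/reports/q4.doc", b"quarterly numbers")
print(gw.read_file("/reports/q4.doc"))
```

Swapping backends changes where and how files are stored without touching the clients, which is the essence of the translation layer described in the article.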

F5 is also opening up an application programming interface to its ARX appliance, which will enable customers to get new functionality from the devices. For example, using the API, a script could be written to compile the changes to a file or storage system since an application last scanned it. When the application scans for an update, the script would feed it just the changes since the last scan rather than having the application scan the whole system itself, a more time-consuming option.

The API will be provided to customers as part of their maintenance contracts for the ARX, the company says.
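The incremental-scan use case described above amounts to a change log that hands an application only what changed since its last pass. The sketch below shows that idea in miniature; the data shapes and names are invented for illustration and are not the real ARX API.

```python
# A versioned change log: the application remembers the last version it
# saw and asks only for entries newer than that, instead of re-scanning
# the whole storage system.

class ChangeLog:
    def __init__(self):
        self._version = 0
        self._changes = []   # list of (version, path, operation)

    def record(self, path, operation):
        """Note a file-system change and stamp it with a new version."""
        self._version += 1
        self._changes.append((self._version, path, operation))

    def since(self, last_seen_version):
        """Return only the changes newer than the caller's last scan."""
        return [(v, p, op) for v, p, op in self._changes
                if v > last_seen_version]

log = ChangeLog()
log.record("/data/a.txt", "modified")
log.record("/data/b.txt", "created")
cursor = 1                      # the app has already seen version 1
print(log.since(cursor))        # only b.txt's creation is returned
```

The payoff grows with the size of the file system: the cost of an update becomes proportional to what changed, not to what exists.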

F5 is announcing a virtual version of its ARX appliance that can be sold to OEMs to be bundled with other products such as WAN optimization gear or file servers. Also, customers interested in an ARX could readily download a trial copy of ARX to test before deciding whether to buy, the company says.
The virtual version supports VMware (VMW) virtual environments and will cost less than the ARX appliance, but F5 wouldn’t say how much. It will be available in the first quarter of next year, and comes in three models, the 500, 2000 and 4000, for varying capacities.

Join Us: http://bit.ly/joincloud

A Year in the Clouds - How Cloud Computing Exceeded the Hype

We have now reached that time of year when the great and the good partake in the festive tradition of crystal ball gazing, as they predict the IT industry’s future trends for the next twelve months.
Over the next three weeks or so we will be deluged with various top tens, who will move, who will shake, who’ll hit tech heaven with the next iPad and who will reach tech hell with the next Sega Dreamcast.

It was about this time last year that seemingly every list published featured cloud computing as the number one game changer, the one trend that would have the greatest impact on the delivery of IT services. Some went as far as to predict that cloud should be viewed as the single most evolutionary computing development since the web itself was established. Not many argued against the list compilers’ rankings, but many viewed the prediction with a healthy pinch of cynicism.
It was Winston Churchill who once famously stated that “It is a mistake to try to look too far ahead. The chain of destiny can only be grasped one link at a time.”

We find ourselves one year on, with all of us having been bestowed with that marvellous gift of hindsight, and are now in a position to judge whether the soothsayers were on the money or whether Churchill’s cautionary note rings true.

So in 2010, did we reach for the cloud? The answer has to be a resounding yes, with the reality matching, and quite possibly exceeding, the hype.

Earlier this week, Angus MacSween, industry veteran and CEO of the UK’s iomart group plc told Dow Jones “I have never seen something happen quite as quickly as this. Six months ago around one-fifth to one-tenth of enquiries from potential customers related to cloud computing; now it is roughly nine out of ten.” He also stated that the attitude of firms’ IT departments has changed. “Whereas once they were reluctant to cede control of new projects, now they look to outsource to the cloud from the word go. We are witnessing a paradigm shift away from traditional on-premise models to the cloud”.
Join Us: http://bit.ly/joincloud

08 December 2010

Peeling Onions in the Cloud

From a conceptual standpoint, consumability through abstraction is arguably one of the most important benefits of cloud computing. The cloud offers up some collection of raw resources (i.e., servers, networks, storage, and applications) as a set of pre-configured, pre-integrated, and ready-to-use services. As a result, users typically need to know a good deal less about how those resources are set up, and can instead concentrate on consuming them to deliver their own set of services.

While the benefits offered by abstraction (namely consumability) are most certainly a good thing, abstraction can also be problematic. What do I mean? Well, while users understand the benefits they get from abstraction, sometimes they need to peel back the layers of the onion. In other words, they need to pop the hood and exercise more control over resource configuration within their cloud. While I expect this need is really news to no one, the implications on the cloud service provider, and subsequently cloud service consumer, are quite interesting to examine.

In order to provide a sense of concreteness around this discussion, I want to share the kind of discussions I have with users on a regular basis. A considerable part of my day job involves working with users implementing a cloud management product that allows them to more rapidly and consistently provision application middleware environments into an on-premise cloud. The fundamental premise of this solution is a patterns-based approach to middleware in the cloud. In this sense, a pattern is a representation of a particular application environment. Further, to a deployer, a pattern abstracts away the mundane details of the integration and configuration of the middleware supporting an application, and instead presents a simple, cloud-deployable unit. Therefore, the patterns are an abstraction of middleware resources delivered in the cloud.
While the patterns-based approach offers up a nice abstraction to the deployer, not everyone in an organization plays the role of deployer. Some within the organization are responsible for building the patterns that represent their desired middleware environments. It should come as no shock that these environments require customizations, and these customizations apply to many different layers in the software stack. Let the peeling begin!
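The pattern idea described above can be sketched as a small data structure: a named middleware topology that a deployer expands into concrete instances without touching the wiring details. All names below are invented for illustration; no specific vendor product is modeled.

```python
# A toy "pattern": a reusable description of a middleware environment
# that a deployer instantiates for a target environment.

from dataclasses import dataclass, field

@dataclass
class Pattern:
    name: str
    nodes: list = field(default_factory=list)   # (role, image) pairs

    def deploy(self, environment: str):
        """Expand the pattern into concrete instance names for an environment."""
        return [f"{environment}-{role}" for role, _image in self.nodes]

# A pattern builder defines the topology once...
web_app = Pattern("clustered-web-app",
                  nodes=[("http-server", "httpd-2.2"),
                         ("app-server", "was-7.0"),
                         ("db", "db2-9.7")])

# ...and deployers stamp out copies without knowing the internals.
print(web_app.deploy("prod"))
# -> ['prod-http-server', 'prod-app-server', 'prod-db']
```

The split between the two roles in the article maps directly onto this sketch: the pattern builder customizes `nodes`, while the deployer only ever calls `deploy`.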

07 December 2010

Two Weeks, Two Companies, Two Results: The Tale of SalesForce and Cisco

The stock price of Cisco, a darling of the stock market for a long time, fell 16% and contributed to a 73-point drop in the Dow index on November 11, 2010. SalesForce, on the other hand, shot into the upper atmosphere, up by 18%. Interestingly, Cisco's market cap fell by $24 billion, more than the total market cap of SalesForce.

While stock swings are not uncommon, what made these two companies change their market value so rapidly? Investors usually peer into the future and buy or sell stocks based on their projections. Cloud computing is being recognized by investors as an engine of growth, rewarding certain companies like SalesForce.

Cisco, the sixth largest technology company by market value(1), has some products that are challenged by solutions delivered free or virtually free. One example is the consumer-facing umi telepresence compared to Skype or Gtalk. Also, the next iPad is rumored to have a camera built in, and there is a plethora of smartphones planned or shipping with video chat capability. In this example, Cisco is going after a video conferencing market already crowded with cheap solutions. I expect Cisco to make some good cloud start-up acquisitions to enhance its server product line capabilities in the cloud market.

03 December 2010

Google’s Office Trojan Horse

It’s no secret that Google has been eyeing Microsoft’s lucrative Office application franchise since the release of the premium, supported version of Google Apps a couple of years ago.

Taking a page from Apple’s old playbook of using the education market to get a foot in the door, Google has scored some big wins among university and government IT buyers. They claim to have over 10 million students using Google Apps, with over 3 million companies making the switch -- undoubtedly most of these are small firms, but a recent win with the State of Wyoming for over 10,000 seats shows Google triumphant in some head-to-head enterprise contests with Microsoft.

Targeting price-sensitive individuals and students, who are also less attached to legacy software and used to running their lives online, was a logical opening gambit, but Google is making its next move squarely into the mainstream enterprise market with the beta release this week of its Cloud Connect for Microsoft Office.

The technology, originally acquired from DocVerse, bridges the gap between thick local applications and data, and cloud-based software and storage. Cloud Connect is a plug-in for Office 2003, 2007 and 2010 (sorry, no Mac support yet) that allows editing Office documents within the familiar confines of Word or PowerPoint, while automatically syncing them to Google’s cloud service. An interesting wrinkle is that once in the cloud, the documents inherit Google’s versioning and multi-user editing capabilities, so that several users can simultaneously edit a document, even locally within Office, without stepping on one another’s changes. (The technology is quite amazing -- those of you with a CS bent can read the full details of how they pull this off starting with the challenges, the solution and finally the optimizations).
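The simultaneous-editing behavior described above rests on merging concurrent edits against a shared base version. Google's real mechanism (differential synchronization with operational transforms) is far more sophisticated; the sketch below only shows the basic three-way merge intuition, paragraph by paragraph, with invented data.

```python
# A toy three-way merge: combine two users' edits of a common base,
# paragraph by paragraph. Edits to different paragraphs merge cleanly;
# conflicting edits to the same paragraph keep "yours" in this sketch.

def merge(base, yours, theirs):
    """Merge two edits of `base`; assumes all three versions have the
    same number of paragraphs."""
    merged = []
    for b, y, t in zip(base, yours, theirs):
        if y == b:          # you didn't touch it -> take their version
            merged.append(t)
        else:               # you changed it -> keep yours
            merged.append(y)
    return merged

base   = ["Title", "Intro paragraph", "Conclusion"]
yours  = ["Title", "Rewritten intro", "Conclusion"]
theirs = ["New Title", "Intro paragraph", "Conclusion"]
print(merge(base, yours, theirs))
# -> ['New Title', 'Rewritten intro', 'Conclusion']
```

Real collaborative editors work at the level of individual keystrokes and transform concurrent operations against each other, which is what lets several users type in the same paragraph at once without "stepping on one another's changes."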

Of course, Microsoft now has similar capabilities with Office 2010 (and Mac Office 2011) with its ability to save to Windows SkyDrive, but Cloud Connect certainly could drive a wedge between Office users who don’t yet have an enterprise collaboration implementation and their Microsoft account rep seeking to sell them on SharePoint or BPOS.

Many could find the hybrid approach coupling Google’s strength in online document sharing and collaboration with the familiar standby of Microsoft’s Office suite the best of both worlds. The risk for Microsoft is that once documents are in Google’s ecosystem, users could find themselves doing more and more of the content creation, editing and sharing online, rendering Office increasingly superfluous.

Join Us: http://bit.ly/joincloud

Optimizing Performance and Availability in Virtual Infrastructures

Many IT administrators have already learned the hard way that managing the performance and availability of services built on virtualization technologies can be difficult, if not impossible, at times. All too often, early adopters of virtualization have struggled with limited technology features and stability constraints, while learning new ways to effectively manage capacity requirements. Fortunately, some platforms now offer clustering solutions that are mature enough to automate the balancing of workloads across physical resources. When combined with disciplined capacity planning and sound deployment configurations, it is possible to achieve fast, scalable, and highly available IT services using virtualization.

Join Us: http://bit.ly/joincloud


The Top 5 Overlooked Reasons Why Business Belongs in the Cloud

There are plenty of “Top 5 lists” with generic reasons for why businesses should migrate into SaaS and cloud computing. Scalability, cost, mobility – they’re good reasons, sure, but we’ve heard them before: what else does cloud computing offer? If you’re thinking about moving your business into the cloud but haven’t yet, here are five reasons that are often overlooked:

1. Clients notice. Traditionally, IT has served a “backend” role in business. With the exception of email and websites, most businesses hide their IT solutions from clients, and with good reason: IT is ugly. Cloud computing changes that. Many SaaS offerings and cloud-based applications incorporate new ways of reaching clients as part of their workflow solutions. For example, Solve360, a popular online CRM, allows users to “publish” select materials from project workspaces, enabling real-time client collaboration. E-signature services allow clients to sign documents via a slick, paperless delivery model, and Helpdesk software lets clients access knowledge base forums and ticketed support in a branded, easy to use online environment. When it works, clients notice that you’re new, different, modern, and “slick.” IT itself becomes a branding exercise.

2. Smarter architecture. Amid all the fuss about cloud differentiation it’s easy to forget that, aside from being cloud-based, many cloud apps are simply designed better than their on-premise counterparts. This could be attributable to a whole host of reasons, the most prominent of which is that (good) cloud apps have been designed entirely from the ground up. Whereas most on-premise solutions have strong ancestral roots in software designed 10-20 years ago, cloud apps have been developed much more recently, meaning they’ve benefited from years of accumulated programming and business experience. Cloud apps are designed for modern businesses: most on-premise apps simply aren’t.

3. Usability. One of the great innovations of cloud-computing has been the focus put on end-users. Many legacy apps put function first and usability second (MS Access, anyone?), whereas good cloud apps don’t see a difference between the two. This key principle can’t be overstated: software is only as powerful as the people using it. Generally speaking (and yes, there are exceptions to this) cloud-based software understands that people matter, creating a better user experience and increasing efficiency.

4. Integration. We just published a blog post blasting API integration, but it’s worth noting that at least cloud-based software makes API integration a viable and affordable workflow solution. Good luck getting anything to work well with a legacy app, especially on the cheap: compare that reality with the generous and freely available APIs that most SaaS and cloud-based vendors offer and it’s an easy sell.

5. Quality of Service. This only applies to SaaS, but it’s a powerful enough attribute that I’m listing it as an argument for all cloud-computing. In a traditional IT setting, clients have a one-time transaction with vendors, repeated every few years for product upgrades. In the world of SaaS, clients generally pay vendors month-to-month and upgrades and bug-fixes are released on a significantly ramped-up timescale. This means that: A) clients can drop out at any time, giving vendors a perpetual incentive to innovate, and: B) clients get a product that’s updated far, far more frequently than before. In addition, SaaS vendors increasingly have robust forums and user communities where support questions and feature requests are addressed quickly, effectively, and by multiple user types. This establishes a culture of support and user-driven innovation that has long been missing from on-premise software.