10 tips for speeding up Outlook

Takeaway: Does Outlook’s pokey performance have you gnashing your teeth with frustration? Try these simple fixes to give it a kick in the pants.

If you’ve used Microsoft Outlook for a while, you know that it can slow down… way down. In fact, when not looked after, Outlook can become nearly useless. Fortunately, there are several things you can do to make Outlook not just usable but significantly faster. And none of these techniques requires a single configuration change on your mail server, whether you connect via Exchange, IMAP, or POP3.

Of course, some of these suggestions might seem to have a bit more “duh” factor than others. But you never know what level of skill you’re dealing with, so we’ll cover all the bases. In the end, you should have a much faster Outlook experience.

1: Update Windows

Many people don’t realize that Windows updates also quietly deliver the updates for Microsoft Office. But it’s not just the Office updates that can help speed up Outlook, so make sure you allow all updates to happen. Why? There are times when Microsoft updates Exchange, and those updates can break or disrupt the communication between Outlook and Exchange. Or Microsoft might issue a patch to the application itself to resolve a speed issue or close a security hole.

2: Download Complete Items

When you connect Microsoft Outlook via either IMAP or POP3, you should have Outlook set up to download the complete message (instead of just the header). If you do this, Outlook won’t have to sync with the server every time you click on a new item (as it will already be in the data file). How you do this will depend upon which version of Outlook you’re using, but basically, you’re looking for the setting Download Complete Item Including Attachments.
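The tradeoff is easy to see in miniature. A raw message is a header block, a blank line, then the body; a header-only sync transfers just the first part and leaves the rest to be fetched on demand. This Python sketch (with a made-up message standing in for real mail) shows how little of a large message the headers actually account for:

```python
# A raw RFC 822 message: header block, blank line, body.
RAW_MESSAGE = (
    "From: alice@example.com\r\n"
    "To: bob@example.com\r\n"
    "Subject: Quarterly report\r\n"
    "\r\n"
    "Hi Bob,\r\n" + "x" * 5000  # the body stands in for a large attachment
)

# A header-only sync transfers everything before the first blank line;
# the body must still be pulled from the server when the item is opened.
header, _, body = RAW_MESSAGE.partition("\r\n\r\n")

print(len(header))  # bytes a header-only sync downloads
print(len(body))    # bytes still owed to the server on first click
```

With Download Complete Item Including Attachments enabled, both parts land in the local data file up front, so opening the item never has to wait on the server.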

3: Archive your Inbox

I can’t tell you how many times I see clients with thousands upon thousands (and in some cases tens of thousands) of emails in their Inbox. This can cause serious issues, especially when using PST files. Instead of just letting those Inbox folders grow to outrageous proportions, set up auto archiving so that your Inbox retains only a portion of those emails. I like to tell clients to keep the current and previous months’ email in the Inbox and archive everything else. When you archive, you effectively create a new data file, so Outlook doesn’t have to strain against the weight of an oversized PST or OST file.
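Outlook’s AutoArchive applies this policy to your PST/OST automatically, but the policy itself is simple enough to sketch. The Python below is a stand-in using the standard mailbox module and mbox files rather than PST files, and the file names are illustrative; it moves everything older than the current and previous months into a separate archive file:

```python
import mailbox
from datetime import datetime, timedelta
from email.utils import parsedate_to_datetime

def archive_old_mail(inbox_path, archive_path, now=None):
    """Move messages older than the previous month into an archive file."""
    now = now or datetime.now()
    # First day of the previous month: anything sent before this gets archived.
    cutoff = (now.replace(day=1) - timedelta(days=1)).replace(day=1)

    inbox = mailbox.mbox(inbox_path)
    archive = mailbox.mbox(archive_path)
    moved = 0
    for key in list(inbox.keys()):
        sent = parsedate_to_datetime(inbox[key]["Date"])
        if sent.replace(tzinfo=None) < cutoff:
            archive.add(inbox[key])  # copy into the new data file...
            inbox.remove(key)        # ...then drop it from the inbox
            moved += 1
    inbox.flush()
    archive.flush()
    return moved
```

Against a real Outlook profile you’d let AutoArchive do the moving; the point is that a fresh, small data file is what keeps the Inbox responsive.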

4: Use Cached Exchange Mode

If you use Cached Exchange Mode, Outlook keeps a copy of your server-side mailbox cached on your local machine (in an OST file). This can go a long way toward speeding up your Outlook experience, because Outlook doesn’t have to read its data across a network. Instead, all it has to do is read the locally stored data file. This option (obviously) is available only when connecting Outlook to an Exchange server.

5: Compact your PST File

When the Outlook PST file gets out of hand, a built-in tool can keep the size of that file under control. One of the issues is that even when you delete email from your Inbox, the size of the PST file may remain the same. If you’re using Outlook 2010, you can go to Account Settings | Data Files and select the data file to be compacted. Once you’ve selected the file, click Settings | Advanced | Outlook Data File Settings and then click Compact Now. Depending on the size of your data file, this process can take some time.

6: Repair your PST File

Scanpst is often my go-to tool when Outlook is acting off-kilter. It will scan through your data file and look for data inconsistencies and errors. Here’s the thing about Scanpst — it’s not always the easiest tool to find. Do yourself a favor and search through your C drive to locate the Scanpst.exe file, then make note of its location (usually within the Office installation folder). But be forewarned: This tool can cause PST files to become unusable, so make sure you back up that data file before you start the repair. Fortunately, should Scanpst find errors, it will prompt you to make a backup before it attempts to fix them.
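The hunt for Scanpst.exe is easy to script. Here’s a minimal Python sketch that walks a directory tree and reports every copy of a file it finds; the paths in the comment are just examples of where you might point it:

```python
import os

def find_file(root, name):
    """Return the full path of every file called `name` under `root`,
    matched case-insensitively (Windows file names are case-insensitive)."""
    hits = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for filename in filenames:
            if filename.lower() == name.lower():
                hits.append(os.path.join(dirpath, filename))
    return hits

# e.g. find_file(r"C:\Program Files", "scanpst.exe") on a typical Office
# install; searching all of C:\ works too, just more slowly.
```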

7: Cut down on the published and shared calendars

Yes, it’s easy to publish and share your calendars with others. The problem is, the more you do it, the more drag you’re putting on Outlook. The more data Outlook has to share and pull down from the Internet, the slower it will perform. Sure, it’s fine to have one or two shared calendars (and even more if you have a lightning-fast data pipe). Just know that the more data you have to push and pull, the slower your connection will be.

8: Disable RSS

By default, Microsoft Outlook will sync RSS feeds from Internet Explorer to the RSS reader in Outlook. If you have a lot of RSS feeds bookmarked in Internet Explorer, that syncing could easily bring Outlook to a crawl. Disable this feature (if you don’t use Outlook as an RSS reader) from within Outlook 2010 by going to Options | Advanced and then unchecking both options under RSS Feeds.

9: Disable add-ins

How many times have you installed a program only to find it installed something else behind your back? This can happen to Outlook as well as Internet Explorer, and sometimes those add-ins can cause major Outlook slowdowns. To find out what add-ins you have installed in Outlook 2010, go to Options | Add-ins, select COM Add-ins from the drop-down, and click the Go button. The resulting window will list all add-ins available to Outlook. Search through this list and uncheck any that seem suspect.

10: Fix ShoreTel Windows 7 Integration

If you use the ShoreTel Communicator, you might notice some issues when trying to open and use Outlook. The problem lies in an incompatibility between ShoreTel and Windows 7. The fix is simple: Open the Task Manager and look for a process called Agent.exe. Right-click that entry and click Properties | Compatibility. Choose the Run This Program In Compatibility Mode For option and then select Windows XP (Service Pack 3).

Other Tricks?

There’s no reason anyone should have to struggle with a bogged-down Outlook that has you pulling your hair out strand by strand. These tips should help you enjoy a speedier, more reliable Outlook experience.

What other methods have you found for improving Outlook performance? Share your suggestions with fellow TechRepublic members.

10 Compelling Reasons to Upgrade to Windows Server 2012

Takeaway: Windows Server 2012 is generating a significant buzz among IT pros. Deb Shinder highlights several notable enhancements and new capabilities.

We’ve had a chance to play around a bit with the release preview of Windows Server 2012. Some have been put off by the interface-formerly-known-as-Metro, but with more emphasis on Server Core and the Minimal Server Interface, the UI is unlikely to be a “make it or break it” issue for most of those who are deciding whether to upgrade. More important are the big changes and new capabilities that make Server 2012 better able to handle your network’s workloads and needs. That’s what has many IT pros excited.

Here are 10 reasons to give serious consideration to upgrading to Windows Server 2012 sooner rather than later.

1: Freedom of interface choice

A Server Core installation provides security and performance advantages, but in the past, you had to make a commitment: If you installed Server Core, you were stuck in the “dark place” with only the command line as your interface. Windows Server 2012 changes all that. Now we have choices.

The truth Microsoft realized is that the command line is great for some tasks and the graphical interface is preferable for others. Server 2012 makes the graphical user interface a “feature” — one that can be turned on and off at will. You do it through the Remove Roles And Features option in Server Manager.

2: Server Manager

Speaking of Server Manager (Figure A), even many of those who dislike the new tile-based interface overall have admitted that the design’s implementation in the new Server Manager is excellent.

One of the nicest things about the new Server Manager is its multi-server capabilities, which make it easy to deploy roles and features remotely to physical and virtual servers. It’s easy to create a server group — a collection of servers that can be managed together. The remote administration improvements let you provision servers without having to make an RDP connection.

3: SMB 3.0

The Server Message Block (SMB) protocol has been significantly improved in Windows Server 2012 and Windows 8. The new version of SMB supports new file server features, such as SMB transparent failover, SMB Scale Out, SMB Multichannel, SMB Direct, SMB encryption, VSS for SMB file sharing, SMB directory leasing, and SMB PowerShell. That’s a lot of bang for the buck. It works beautifully with Hyper-V, so that VHD files and virtual machine configuration files can be hosted on SMB 3.0 shares. A SQL system database can be stored on an SMB share as well, with improvements to performance. For more details about what’s new in SMB 3.0, see this blog post.

4: Dynamic Access Control (DAC)

Even though some say Microsoft has shifted the focus away from security in recent years, it would be more accurate to say it has shifted the focus from separate security products to a more “baked in” approach of integrating security into every part of the operating system.

Dynamic Access Control is one such example, helping IT pros create more centralized security models for access to network resources by tagging sensitive data both manually and automatically, based on factors such as the file content or the creator. Claims-based access controls can then be applied. Read more about DAC in my “First Look” article over on WindowSecurity.com.

5: Storage Spaces

Storage is a hot — and complex — topic in the IT world these days. Despite the idea that we’re all going to be storing everything in the public cloud one day, that day is a long way off (and for many organizations concerned about security and reliability, it may never happen). There are myriad solutions for storing data on your network in a way that provides better utilization of storage resources, centralized management, and better scalability, along with security and reliability. Storage area networks (SANs) and network attached storage (NAS) do that, but they can be expensive and difficult to set up.

Storage Spaces is a new feature in Server 2012 that lets you use inexpensive hard drives to create a storage pool, which can then be divided into spaces that are used like physical disks. They can include hot standby drives and use redundancy methods such as 2- or 3-way mirroring or parity. You can add new disks any time, and a space can be larger than the physical capacity of the pool. When you add new drives, the space automatically uses the extra capacity. Read more about Storage Spaces in this MSDN blog post.

6: Hyper-V Replica

Virtualization is the name of the game in the server world these days, and Hyper-V is Microsoft’s answer to VMware. Although the latter had a big head start, Microsoft’s virtualization platform has been working hard at catching up, and many IT pros now believe it has surpassed its rival in many key areas. With each iteration, the Windows hypervisor gets a little better, and Hyper-V in Windows Server 2012 brings a number of new features to the table. One of the most interesting is Hyper-V Replica.

This is a replication mechanism that will be a disaster recovery godsend to SMBs that may not be able to deploy complex and costly replication solutions. It logs changes to the disks in a VM and uses compression to save on bandwidth, replicating from a primary server to a replica server. You can store multiple snapshots of a VM on the replica server and then select the one you want to use. It works with both standalone hosts and clusters in any combination (standalone to standalone, cluster to cluster, standalone to cluster or cluster to standalone). To find out more about Hyper-V Replica, see this TechNet article.

7: Improvements to VDI

Windows Terminal Services has come a long way, baby, since I first met it in Windows NT TS Edition. Renamed Remote Desktop Services, it has expanded to encompass much more than the ability to RDP into the desktop of a remote machine. Microsoft offered a centralized Virtual Desktop Infrastructure (VDI) solution in Windows Server 2008 R2, but it was still a little rough around the edges. Significant improvements have been made in Server 2012.

You no longer need a dedicated GPU in the server to use RemoteFX, which vastly improves the quality of graphics over RDP. Instead, you can use a virtualized GPU on standard server hardware. USB over RDP is much better, and the Fair Share feature can manage how CPU, memory, disk space, and bandwidth are allocated among users to thwart resource hogs. Read more about Server 2012 VDI and RDP improvements here.

8: DirectAccess without the hassle factor

DirectAccess was designed to be Microsoft’s “VPN replacement,” a way to create a secure connection from client to corporate network without the performance drain and with a more transparent user experience than a traditional VPN. Not only do users not have to deal with making the VPN work, but administrators get more control over the machines, with the ability to manage them even before users log in. You apply group policy using the same tools you use to manage computers physically located on the corporate network.

So why hasn’t everyone been using DirectAccess with Server 2008 R2 instead of VPNs? One big obstacle was the dependency on IPv6. Plus, it couldn’t be virtualized. Those obstacles are gone now. In Windows Server 2012, DirectAccess works with IPv4 without having to fool with conversion technologies, and the server running DirectAccess at the network edge can now be a Hyper-V virtual machine. The Server 2012 version of DA is also easier to configure, thanks to the new wizard.

9: ReFS

Despite the many advantages NTFS offers over early FAT file systems, it’s been around since 1993, and Windows aficionados have been longing for a new file system for quite some time. Way back in 2004, we were eagerly looking forward to WinFS, but Vista disappointed us by not including it. Likewise, there was speculation early on that a new file system would be introduced with Windows 7, but it didn’t happen.

Windows Server 2012 brings us our long-awaited new file system, ReFS, or the Resilient File System. It supports many of the same features as NTFS, although it leaves behind some others, perhaps most notably file compression, EFS, and disk quotas. In return, ReFS gives us data verification and auto-correction, and it’s designed to work with Storage Spaces to create shrinkable/expandable logical storage pools. The new file system is all about maximum scalability. It supports up to 16 exabytes in practice (this is also the theoretical maximum in the NTFS specifications, though in the real world NTFS is limited to 16 terabytes), and it has a theoretical limit of 256 zettabytes (more than 270 billion terabytes). That allows for a lot of scaling.
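The “more than 270 billion terabytes” figure checks out if you read the units as binary (IEC) multiples; in decimal units, 256 zettabytes works out to a still-enormous 256 billion terabytes. A quick arithmetic check:

```python
# 256 zettabytes in binary-style units: 1 ZiB = 2**70 bytes, 1 TiB = 2**40 bytes.
ZiB = 2 ** 70
TiB = 2 ** 40

refs_theoretical_limit = 256 * ZiB
print(refs_theoretical_limit // TiB)  # 274877906944, roughly 275 billion TiB
```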

10: Simplified Licensing

Anyone who has worked with server licenses might say the very term “simplified licensing” is an oxymoron. But Microsoft really has listened to customers who are confused and frustrated by the complexity involved in finding the right edition and figuring out what it’s really going to cost. Windows Server 2012 is offered in only four editions: Datacenter, Standard, Essentials, and Foundation. The first two are licensed per-processor plus CAL, and the latter two (for small businesses) are licensed per-server with limits on the number of user accounts (15 for Foundation and 25 for Essentials).


GoDaddy Hacked, Millions of Sites Down

GoDaddy.com, the largest domain name registrar on the Web, has been taken offline, and a self-proclaimed member of the Anonymous hacktivism collective is taking responsibility.

The administrators of GoDaddy confirmed on Monday that they were suffering from technical issues, which the website TechCrunch reports are affecting a multitude of websites and their affiliated email accounts hosted through the service. Although the company has not yet discussed specifics, a self-described member of Anonymous says that he or she is responsible, a claim that has not been verified.

On Twitter, user @AnonymousOwn3r writes, “the attack is not coming from Anonymous coletive [sic] , the attack it’s coming only from me” and that the action is being carried out “to test how the cyber security is safe and for more reasons that i can not talk now.”

GoDaddy has tweeted, “We’re aware of the trouble people are having with our site. We’re working on it.”

On Friday, it was reported that the White House is preparing to roll out a cybersecurity executive order that will serve as a surrogate until Congress can come to agreement on bipartisan legislation to protect America’s computer infrastructure.

Earlier this year, GoDaddy announced that it would support the Stop Online Piracy Act, or SOPA, controversial legislation that, if approved, would have greatly changed the US government’s ability to monitor the Internet. The company eventually reversed its stance, but not before a massive protest resulted in many of its clients switching to other domain registrars. The boycott reportedly ended with thousands of GoDaddy’s millions of customers, including Wikipedia, cancelling their accounts.

Founded in 1997, Arizona-based GoDaddy.com is used by millions of customers worldwide, including a large number of small businesses. At 4 p.m. EST, GoDaddy tweeted, “Update: Still working on it, but we’re making progress. Some service has already been restored. Stick with us.”

Other social media accounts affiliated with Anonymous have not confirmed the validity of the alleged culprit’s claim and have largely distanced themselves from the hack. GoDaddy’s 24-hour tech support telephone line has also been inaccessible for the duration of the outage.

FAA Reconsiders Electronic Device Policy

The Federal Aviation Administration is forming an industry group to study when smartphones and tablets can be turned on during a flight.

Airline passengers may soon be able to use their smartphones and tablets during flights with fewer interruptions.

The Federal Aviation Administration has formed a committee to reconsider its policy on when electronic devices can be turned on during a flight.

“With so many different types of devices available, we recognize that this is an issue of consumer interest,” Transportation Secretary Ray LaHood said in a statement released today. “Safety is our highest priority, and we must set appropriate standards as we help the industry consider when passengers can use the latest technologies safely during a flight.”

And travelers who fear this may mean they’ll have to listen to the person sitting next to them chat on the phone incessantly don’t need to worry: This policy review will not include allowing calls from cell phones during flights.

Airlines currently prohibit passengers from using devices during takeoff and landing for fear that transmissions would interfere with the airplane’s equipment, but the FAA’s call for comments said consumers expect a much more fluid experience. The formation of an industry group to weigh consumer desire against safety concerns comes after the FAA said earlier this year that it was going to research the issue.

The group — which will include members from the mobile industry, aviation manufacturing, pilot and flight attendant groups, airlines, and passenger associations — will begin soliciting comments from the public tomorrow.

The committee is interested in hearing from aircraft operators, flight crews, security, smartphone and tablet device manufacturers, and passengers.


Goodbye, Hotmail. Hello, Outlook.com

Summary: Microsoft’s flagship mail service for consumers gets a new name and a “modern” Metro-style interface. Here’s how to sign up for a preview and what to expect.

So long, Hotmail. It was nice to know you. Microsoft unveiled a major update to its consumer mail platform today, with a new look, a slew of new features, and a new name that is surprisingly familiar.

The “modern email” service has been in super stealth mode for several months under the codename NewMail. With its formal launch as an open-to-the-public preview, the service gets a new name: Outlook.com. I’ve been using the NewMail beta for a week now and can share some first impressions here.

Outlook, of course, is the serious, business-focused mail client included with Office. Microsoft used the brand with Outlook Express, its lightweight email client in Windows XP, but dumped the name with the launch of Windows Vista in 2006.

Restoring the Outlook name to Microsoft’s consumer email service accomplishes two goals. First, it dumps the Hotmail brand, which is tarnished beyond redemption, especially among technically sophisticated users who have embraced Google’s Gmail as the default standard for webmail. More importantly, it replaces the Hotmail domain with a fresh top-level domain that’s serious enough for business use. (If you have an existing Hotmail.com or Live.com address, you can continue to use it with the new Outlook interface. But new addresses in the Outlook.com domain are up for grabs. If you have a common name, I recommend that you get yourself over to Outlook.com now to claim your preferred email address while it’s still available.)

The Outlook.com preview will run alongside Hotmail for now, but when the preview ends, this will be the replacement for all Hotmail and Live Mail users.

With Outlook.com, Microsoft is taking dead aim at Gmail, positioning Google’s flagship service as the old and tired player that is ready for retirement. Gmail, they point out, is eight years old, and its interface and feature set aren’t exactly modern. It doesn’t play well with any social media except its own, it handles attachments in a stodgy and traditional way, and it’s not particularly elegant when it comes to managing the deluge of email we all have to deal with every day.

So what’s new about NewMail — sorry, Outlook.com? And why would anyone consider switching from Gmail? The most obvious change in the web interface, of course, is the overall design, which gets the full Metro treatment.

That three-pane layout follows the familiar Outlook standard, but the typography is definitely new: clean and crisp, with no wasted ornamentation or clutter. It should come as no surprise that the default organization is optimized for use on touch-enabled devices.

A pane on the right shows different content, depending on the context. If you’re communicating with a friend or colleague who’s in your address book or connected via a social-media service, you’ll see updates about that person on the right side, with the option to chat with them (via Messenger or Facebook chat) in that pane. In a demo, Microsoft showed off Skype integration and said it will be coming later in the preview. If you’ve selected no message, the right pane might show ads, which appear in Metro-style boxes with text; an image preview appears if you hover over the ad. As part of its positioning against Google, Microsoft has taken pains to note that your messages aren’t scanned to provide context-sensitive ads, as they are with Gmail.

This is a pure HTML interface, which means the functionality is consistent across different browsers and on alternative platforms. I tested NewMail on a Mac using Safari and Chrome and in both Firefox and Chrome on several Windows PCs. Everything worked as expected. I also tested the web-based interface in mobile Safari on an iPad, where it also displayed perfectly (after switching from the default mobile layout). On mobile devices, you’ll be able to use native apps. An app for iOS devices should be available immediately. Microsoft promises an Android app “soon” that will enable Exchange ActiveSync support for older Android versions.

A command bar at the top of the page provides access to commands as needed. If a command isn’t available in the current context, it’s not visible on the screen.

The preview pane (a feature that’s still experimental in Gmail even after eight years) lets you read and reply to messages without leaving the main screen. Action icons that appear when you move the mouse over an item in the message list let you file, delete, or flag the message with a single click or tap.

The new Outlook has some impressive mail management smarts built in. It automatically recognizes newsletters and other recurring types of mail. A Schedule Cleanup option in the message header (also available on the command bar) lets you create rules on the fly that automatically delete or file similar messages to reduce clutter. You can specify, for example, that you want to keep only the most recent message from a “daily deals” site. You can also define how many messages you want to keep from a particular sender or automatically delete or file newsletters after a set number of days.
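That “keep only the most recent message” rule boils down to a newest-message-per-sender filter. Here’s a toy Python version of the idea; the message list is made up, and the real feature naturally works on your actual mail rather than tuples:

```python
from datetime import date

def newest_per_sender(messages):
    """Keep only the newest (sender, date, subject) entry per sender."""
    keep = {}
    for sender, sent, subject in messages:
        if sender not in keep or sent > keep[sender][0]:
            keep[sender] = (sent, subject)
    return {sender: subject for sender, (sent, subject) in keep.items()}

inbox = [
    ("deals@example.com", date(2012, 9, 1), "Daily deal #1"),
    ("deals@example.com", date(2012, 9, 3), "Daily deal #3"),
    ("deals@example.com", date(2012, 9, 2), "Daily deal #2"),
]
print(newest_per_sender(inbox))  # only "Daily deal #3" survives
```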

For newsletters that don’t contain an obvious unsubscribe link, the new Outlook adds a universal unsubscribe feature at the bottom of the message. When you select this option, the web service sends an unsubscribe request on your behalf and creates a message-blocking rule.

One huge differentiator from old-school webmail services like Gmail is the new Unified Address Book in Outlook.com. It takes a page from Microsoft’s People hubs in Windows 8 and the Windows Phone platform, pulling together your traditional address book — where you manage names and details — and combining it with the social media services of which you’re a member.

The advantage, of course, is that you always have the most up-to-date contact information for friends and colleagues, assuming they update their profiles. The new Outlook does a pretty good job of combining records. If you have contacts that appear in multiple locations, you can manually link or unlink those records as needed. Supported services include anything you can link to your Microsoft account, including Facebook, Twitter, LinkedIn, and Flickr. You can import contacts from Google and Facebook if you want to keep them locally.

In terms of creating and sending photos and file attachments, the new Outlook integrates exceptionally well with SkyDrive, so that you can email large attachments and photo albums, storing them on SkyDrive with well-integrated links that the recipient can access with a click. The spec sheet says single attachments can be up to 300 MB in size. If they’re stored on SkyDrive, you don’t have to worry about the message being rejected by the recipient’s mail service. And of course, the service incorporates all of the Office Web Apps, which makes the process of sharing Word documents, PowerPoint slide decks, and Excel workbooks much more seamless.

On the back end, the interface for managing an email account is cleaner. You can still create aliases that you use for sites and contacts where you don’t want to share your real address. And if you just want to experiment with the new service, you can redirect your Gmail messages temporarily to the new account or sign in with an existing Hotmail or Live address. (I’ve had my Gmail account redirected to Hotmail for a year without problems.)


Finding the Right Keywords in Search Engine Placement

Does the task of finding good keywords for your website seem daunting? Search Engine Placement provides a variety of keyword suggestions from all competition and search volume levels, which makes it easy to pick the best keywords for your site and your goals.

When you’re researching keywords, keep these things in mind:

Low search volume, low competition (known as long tail keywords)
Long tail keywords typically contain three to five words in a phrase, and they are the bread and butter of keywords. Generally speaking, it’s easier to rank with these keywords, and they convert clicks to customers at a higher rate because the searcher knows specifically what they are looking for. For example, someone who sells acrylic aquariums might use “50 gallon bull nose aquariums.”

High volume, low competition (short tail keywords)
Short tail keywords are new keywords or phrases that have recently started generating search traffic. You need to be quick to catch them, though. For example, a recent short tail keyword is “Susan Boyle.”

High volume, high competition (wish list keywords)
Wish list keywords are the words and phrases everyone wants to use, such as “real estate,” “aquariums,” or “cell phones.” While wish list keywords do generate a lot of traffic, the conversion levels are generally very low because the words and phrases are too generic.

Take the term “aquariums,” for example. People searching for that word might want to buy an aquarium, find a local aquarium, or see photos of aquariums around the world. If I sell aquariums, relying on this as my primary keyword would take a huge effort to rank, and most of the traffic generated by this keyword would likely be irrelevant.
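As a rough first pass, the categories above can be told apart by phrase length alone. This Python sketch uses the three-to-five-word threshold from the definition above as a heuristic, not a formal rule, and real keyword research still needs the volume and competition data the tool supplies:

```python
def classify(phrase):
    """Very rough keyword triage by word count alone."""
    words = len(phrase.split())
    return "long tail" if 3 <= words <= 5 else "head term"

for keyword in ["aquariums", "cell phones", "50 gallon bull nose aquariums"]:
    print(keyword, "->", classify(keyword))
```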


10 Windows 8 Keyboard Shortcuts You Need to Remember

Takeaway: There are 100+ keyboard shortcuts available for Microsoft Windows 8, but there are several you’ll want to remember because you’ll use them often.

In April 2012, Greg Shultz created a free cheat sheet of 100 Windows 8 keyboard shortcuts, a download containing just about every keyboard shortcut you could imagine. Those shortcuts are still valid, of course, but if you’re like me, you can remember only a few Windows 8 shortcuts at a time, so you want to memorize the ones that will be most useful.

So, while I highly recommend that you take advantage of the free PDF download listing all 100 Windows 8 keyboard shortcuts, I also recommend that you commit the following 10 to memory, because you are going to need these features often and, for efficiency’s sake, it’s best to have them at the ready.

Windows 8 Specific Keyboard Shortcuts

Keystroke and function (Win is the Windows logo key):

Win: Switch between the Metro Start screen and the last accessed application
Win + C: Access the charms bar
Win + Tab: Access the Metro taskbar
Win + I: Access the Settings charm
Win + K: Access the Devices charm
Win + Q: Access the Apps Search screen
Win + F: Access the Files Search screen
Win + W: Access the Settings Search screen
Win + X: Access the Windows Tools menu
Win + E: Open Computer


Cloud computing: What does it mean for IT jobs?

Takeaway: As adoption of cloud computing services takes off, some argue demand for certain IT jobs will all but disappear.

Just as manual labourers were replaced by the machines of industry in the 19th century, so certain IT roles will be swept away by cloud computing services.

That’s the argument put forward by Gartner research director Gregor Petri, who believes that many roles managing IT infrastructure will all but disappear.

Manual management of IT infrastructure – for instance provisioning additional storage, servers or network capacity for a particular application – will increasingly be automated as software layers in the cloud automatically divert IT resources to where they are needed, he said.

“It is very much like industrialisation,” he said.

“Take the very old example given by Adam Smith of the pin makers who used to take a day to make four pins, then a factory is built that can make 10,000 pins in an hour.

“That is what cloud computing services is making possible: you can carry out these computing tasks on an industrial scale.”

Petri said that just as people no longer make pins manually, so in general people won’t perform tasks like monitoring an app’s storage demands and purchasing and installing new storage for it.

“The cloud computing app is already programmed in a way that allows the application to access additional storage when it is needed; as a result, nobody is needed to do that anymore,” he said.
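The kind of hands-off provisioning Petri describes can be sketched as a simple control loop. This is an illustrative sketch only: the function name, threshold and growth factor below are invented for the example, not taken from any real cloud API.

```python
# Illustrative sketch of automated storage provisioning: a control
# loop grows capacity when utilisation crosses a threshold, with no
# human in the loop. All names and numbers here are hypothetical.

def autoscale_storage(used_gb: float, capacity_gb: float,
                      threshold: float = 0.8,
                      growth_factor: float = 1.5) -> float:
    """Return the new capacity: unchanged while usage stays below the
    threshold, otherwise grown by growth_factor."""
    if used_gb / capacity_gb >= threshold:
        return capacity_gb * growth_factor
    return capacity_gb

# At 85% utilisation the loop provisions more storage automatically;
# at 50% it leaves capacity alone.
print(autoscale_storage(85, 100))
print(autoscale_storage(50, 100))
```

In a real platform this decision runs continuously against live metrics; the point is that no administrator sits in the loop ordering and installing disks.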

“Cloud is allowing the industrialisation of IT, that is why to some people it is very scary.”

While Petri believes that traditional IT infrastructure management roles will become all but defunct, allowing IT systems to be run with fewer people, he said it doesn’t necessarily mean the individuals who carried out those roles will find themselves out of the job.

Instead he sees new roles being created that use those individuals’ technical skills to add value to the business, for instance working with managers in other departments to make company IT systems better fit the needs of staff or customers. “People in those roles need to be flexible in the idea of what their role is,” he said.

The changing landscape of computing – for instance real-time big data analytics or the provision of scalable cloud services to always connected mobile computers – will also create new roles, he said.

When will it happen?

While today’s cloud platforms already provision these resources automatically, Petri said that the effect of this industrialisation of computing will not be felt until more applications are shifted to the cloud.

That could be some time off. Although adoption of cloud services is growing rapidly – Gartner predicts that the market for cloud compute services will grow 48.7 per cent in 2012 to $5 billion, up from $3.4 billion in 2011 – spend on cloud computing services is still only a fraction of global IT spend. However, by 2020 the majority of organisations will rely on the cloud for more than half of their IT services, according to Gartner’s 2011 CIO Agenda Survey.

Will jobs really disappear?

Not everyone is convinced that cloud computing services will have such a profound effect on the IT jobs landscape.

Some believe that while roles will likely transition from in-house IT teams to cloud providers as companies consume more cloud services, the roles and demand for skills will remain.

As a TechRepublic reader who works for a large cloud provider pointed out: “I still deal with the daily hands-on work from thousands of customers/clients, some pretty huge ones at that. Between dealing with their AD, LDAP, Windows/Linux deployments, configuration and code issues, I can say that server administrators will still be needed, in fact more than ever.”

Other readers have pointed out that IT roles tend to endure far longer than expected and certain technical skills remain in demand. Old programming languages never die, as another reader points out: “Back in 1977 I attended a COBOL summer class in my university. The first thing the instructor told us was that it was a dead language, as new technologies were pushing it to extinction… Guess what, early this morning I reviewed (part of my duties as a DBA) some SQL embedded in a COBOL program to run in the z196 mainframe.”


Pros and Cons of Web Hosting for Data Storage

Web hosting for data storage can offer several advantages.

Web hosting for online storage was once a high-priced option. Now, with the cost of web hosting dropping, many companies are offering what’s come to be called cloud storage: Web-based third-party data storage. Your data is stored on a network of servers that are owned and maintained by other companies and housed on their sites rather than at your home or place of business.

Backups

Lost laptops and damaged hard drives are less of a disaster if you make regular backups, and having a Web storage system in place can make this easier. You can even automate your backups so that copies of your data are made regularly without you needing to do anything.
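As a rough illustration of that kind of automation, here is a minimal Python sketch that archives a folder into a timestamped zip file. The paths are hypothetical; in practice you would point it at your working directories and schedule it with cron or Windows Task Scheduler so copies are made without any manual step.

```python
# Minimal automated-backup sketch: bundle a source directory into a
# timestamped zip archive under a backup root. Paths and naming are
# illustrative only.

import shutil
import time
from pathlib import Path

def backup(source: str, dest_root: str) -> Path:
    """Archive the source directory into a timestamped zip under dest_root
    and return the path of the archive that was created."""
    stamp = time.strftime("%Y%m%d-%H%M%S")
    archive_base = Path(dest_root) / f"backup-{stamp}"
    # shutil.make_archive appends the .zip extension itself
    return Path(shutil.make_archive(str(archive_base), "zip", source))
```

The same function could just as easily upload the archive to a cloud storage bucket instead of a local folder; the scheduling, not the destination, is what makes it a safety net.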

Data Loss

While the backup services they offer can be very useful, cloud hosting firms can lose data just like any other company. Firms can suffer fires, thefts, server malfunctions and other incidents that could result in data loss. They can also go out of business, potentially taking your valuable data with them if you don’t have any other copies.

Ease Of Access

Ease of access is one of the main selling points of cloud storage. You can access your data from any computer with an Internet connection, making it easier to work off-site or from multiple different locations. People working on the same project from different physical sites can easily access and share information.

Security Risks

The downside of easier access for you or your staff is that unauthorized access is potentially easier, too. Your information may be less secure than it would be if you restricted it to physical media. Unless you encrypt your data so that it can only be read by authorized personnel, your data is only as secure as your hosting company’s systems. This is a particular concern if you want to store sensitive information such as your clients’ personal details.

Space And Resources

Storage space is generally less of an issue for private individuals, but if your organization handles a lot of data, storage and servers can take up a lot of space. If you have a great deal of storage equipment it can require dedicated climate-controlled rooms. Factors such as insulation and cooling systems mean these may be costly to construct. Increased power consumption can also mean increased costs.

There’s also the headache of maintenance and technical support; you may have to take on extra staff, or you might find that your current IT staff have less time available for other duties. If your needs vary, with heavier demand at some times than at others, the lack of flexibility can mean waste. Using online storage and third-party servers allows you to save valuable space and leave the maintenance to a dedicated firm’s (hopefully) expert staff. Cloud storage can also help keep costs down because most companies only charge you for what you use in terms of space and bandwidth.

Cloud Storage Costs

Inevitably your Web-based storage space comes with a price tag—one that gets larger as your needs increase. The more data you store and the more bandwidth you use to upload and download your data from the hosting company’s servers, the more you’ll have to pay. If your storage needs are fairly predictable, it may be more cost-effective to set up your own on-site storage systems.
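To see how that pricing model adds up, here is a toy Python estimate. The per-GB rates below are invented for illustration; check your provider’s actual price sheet before comparing against the cost of on-site storage.

```python
# Back-of-the-envelope sketch of usage-based cloud storage pricing:
# you pay for what you store plus what you transfer. Rates here are
# made up for illustration.

def monthly_cost(storage_gb: float, egress_gb: float,
                 storage_rate: float = 0.02,
                 egress_rate: float = 0.09) -> float:
    """Estimate a monthly bill from storage held and bandwidth used."""
    return storage_gb * storage_rate + egress_gb * egress_rate

# 500 GB stored plus 100 GB downloaded, at the example rates above,
# comes to roughly $19 for the month.
print(monthly_cost(500, 100))
```

Running the same numbers at several growth scenarios is a quick way to find the break-even point where building your own storage becomes cheaper.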

Call (888) 505-1532 to get started now