Recently Xiaomi, the much-hyped Chinese phone maker, held its first online sale of the new Xiaomi Mi3 smartphone along with its smart MiTV, and both devices sold out in just over 60 seconds.

Your Badge, Please: Why 2014 Will See A War Over Professional Identity



ReadWritePredict is a look ahead at the technology trends and companies that will shape the coming year.



Pour your coffee, sit down, and log in: It’s the routine of hundreds of millions of knowledge workers.


As businesses add a crazy quilt of online services to the tools we use, those logins keep piling up. Information-technology managers—when they don't just throw up their hands—have long dreamed of a nirvana called single-sign-on, where one login rules all. And increasingly, those login credentials will live up in the cloud, managed by some Web giant.


Three companies are on a collision course, jostling for control of the keys to our professional identity. They are Google, Microsoft, and LinkedIn.


The Google Way


Google Apps is the primary way Google insinuates itself into a business. For $5 per user per month, Google offers email, storage, collaborative documents, and more. All of that comes with a Google Account—the key to other Google services, as well as apps that have integrated with Google. (Google recently eliminated some roadblocks to using Google Apps as a login.)



Sundar Pichai oversees Google Apps and Android—a potent combination.



The tradeoff with going Google is that some of Google’s initiatives—for example, pushing Google+, its don't-call-it-a-social-network thing that lets you connect and share with friends online—may clash with the sensibilities of corporations.


Yet the integration of Google accounts with Android apps may appeal to those looking to mobilize their workforces. And the price is hard to argue with. The missing piece here is for Google to find a way to make company-specific versions of Google+ that are private and secure for internal information sharing and video chat—and to deal with the frustrations experienced by users who have both personal and corporate Google accounts.


Microsoft’s Cloud Is Rising


While Google may have a head start in Web-based productivity apps, Microsoft is not resting on its big workplace franchise in Exchange email and Office apps. In fact, it may have found a clever, little-noticed way to bring Office users into a vast online directory.



Microsoft’s Satya Nadella has a clear cloud vision.



In the 1990s, Microsoft ventured into the dangerous waters of creating a universal Web login when it unveiled Hailstorm, later known as Passport. After a torrent of brickbats from privacy and antitrust activists, Microsoft retreated—but Passport quietly lived on, eventually becoming the Microsoft Account, the service people use to log in to Windows today.


There's a little-known enterprise version of the Microsoft Account called Windows Azure Active Directory. It is a version of the directory tools Microsoft has long offered to businesses—but up in the cloud, in one big database of employee logins. If you use Office 365, you have an Azure Active Directory account.


At a press event in October, Microsoft’s cloud chief, Satya Nadella, one of the internal candidates to become the company’s next CEO, unveiled a vision for how Microsoft could bring together its powerful desktop-software franchise and its budding cloud services.


"Every time someone signs up for Office 365, they've populated Azure Active Directory,” Nadella noted. Microsoft is also encouraging It managers to sync their old-school “on-premises” directory servers with Azure, adding to the accounts Microsoft tracks.


Nadella mused about the "notion of having an enterprise directory that's fully programmable and accessible through interfaces” to software developers. In other words, that Microsoft login might not just be the key to your work email and Office apps online—it might become the way you access hundreds of third-party apps, especially on devices running Windows 8 or Windows Phone.
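For developers, a directory that is "fully programmable and accessible through interfaces" would most likely be consumed through a standard OAuth-style sign-in flow. Here is a minimal sketch of that idea in Python; the endpoint hostnames, tenant name and client credentials are placeholders for illustration, not Microsoft's documented values.

```python
# Minimal sketch of a web app signing a user in against a cloud directory
# using a standard OAuth 2.0 authorization-code flow. Endpoint hostnames,
# tenant and credentials below are illustrative placeholders only.
from urllib.parse import urlencode
import requests

TENANT = "contoso.example"                    # hypothetical tenant
AUTHORIZE_URL = f"https://login.directory.example/{TENANT}/oauth2/authorize"
TOKEN_URL = f"https://login.directory.example/{TENANT}/oauth2/token"
CLIENT_ID = "my-app-client-id"                # issued when the app is registered
CLIENT_SECRET = "my-app-secret"
REDIRECT_URI = "https://myapp.example/auth/callback"

def login_redirect_url(state):
    """Step 1: send the browser to the directory's hosted sign-in page."""
    params = {
        "response_type": "code",
        "client_id": CLIENT_ID,
        "redirect_uri": REDIRECT_URI,
        "state": state,   # anti-CSRF value the app checks when the user returns
    }
    return f"{AUTHORIZE_URL}?{urlencode(params)}"

def exchange_code_for_tokens(code):
    """Step 2: trade the returned authorization code for tokens."""
    resp = requests.post(TOKEN_URL, data={
        "grant_type": "authorization_code",
        "code": code,
        "client_id": CLIENT_ID,
        "client_secret": CLIENT_SECRET,
        "redirect_uri": REDIRECT_URI,
    })
    resp.raise_for_status()
    return resp.json()    # typically an access token plus an identity token
```

Once the directory speaks a standard protocol like this, any third-party app can lean on it for login, which is exactly the position Google, Microsoft and LinkedIn are all jockeying for.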


Don't count out Yammer, either. While the online-collaboration tool has mostly been quiet as a subsidiary of Microsoft, which acquired it last year, it had been preparing a big push to lure app developers to use its accounts as logins. Microsoft could revive that strategy, either independently or as an arm of the Azure push.


LinkedIn’s New Connections


If you’re still thinking of LinkedIn as a place to hunt for jobs, catch up: The professional network has been refashioning itself as the hub of its users' daily work lives.


Right now, LinkedIn users post updates publicly. But LinkedIn CEO Jeff Weiner has been talking about how his employees have access to a special version of the site where they can share updates and collaborate internally—an unreleased competitor to Microsoft’s Yammer.



LinkedIn CEO Jeff Weiner sees your connections. LinkedIn CEO Jeff Weiner sees your connections.



In 2013, LinkedIn also rolled out a new contact-management feature and a host of mobile apps—including a controversial one, Intro, which essentially inserts LinkedIn as a middleman for your email, adding details about your correspondents to every message.


LinkedIn also owns Slideshare, a tool for sharing business presentations, and runs Pulse, a Web and mobile app that pulls together the news headlines your colleagues and peers are reading.


And for some time, though it's far less known than, say, Facebook's tools for logging into apps, it has offered a platform that lets developers log users in using their LinkedIn profiles.
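In practice, "log in with LinkedIn" means an app treats a member's LinkedIn profile as the account record: after the OAuth handshake (omitted here), it fetches the profile and keys the local account off the member ID. The endpoint path and field names below reflect LinkedIn's API of that era and are illustrative rather than a guaranteed match for the current API.

```python
# Rough sketch of using a LinkedIn profile as a login identity once an
# OAuth access token has been obtained. Endpoint path and field names are
# illustrative; check LinkedIn's current developer docs before relying on them.
import requests

def linkedin_identity(access_token):
    resp = requests.get(
        "https://api.linkedin.com/v1/people/~?format=json",
        headers={"Authorization": f"Bearer {access_token}"},
    )
    resp.raise_for_status()
    profile = resp.json()
    # Key the local account off the stable member ID, not the display name.
    return {
        "external_id": profile.get("id"),
        "name": (profile.get("firstName", "") + " " + profile.get("lastName", "")).strip(),
    }
```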


Put it all together, and LinkedIn has many of the same things Microsoft and Google do: a professional identity that's portable on the Web, and tools for email, contacts, and collaboration around information.


LinkedIn has one key advantage over Microsoft and Google: It is organized around the modern way we work, where not everyone has the same ending to their email address. The network of contractors, vendors, and partners who swirl through our daily lives may not be in the same single-sign-on directory. But odds are they're on LinkedIn.


Battles Inside And Out


Microsoft and Google, of course, have been in open warfare for some time, and crow about stealing each other's customers. No surprise there. But it will be interesting to watch whether Microsoft insists on keeping Windows Azure Active Directory tied to Windows devices, or makes a play to get into the world of Android. That will likely be a test of whether Microsoft’s new Devices group, bolstered by the addition of Nokia’s handset business, has the upper hand—or if Nadella’s cloud army will be triumphant.


Google, too, has its internal battles to fight. Google Accounts are where Google+, Google Apps, and Android all intersect. Google CEO Larry Page has tried to cut down on internecine warfare by pushing out Andy Rubin as Android chief and installing Sundar Pichai, who now oversees Google Apps and Android. And Google+ chief Vic Gundotra has been persuasive in pushing his sharing tools as a “social layer" throughout Google. But making Google's services work smoothly inside and outside of corporations may require some serious architectural changes—and not all Googlers will be happy about it.


LinkedIn has the least to lose and the most upside here. And one of its virtues may be that it is neither Microsoft nor Google. Already, it has made a very interesting friend in Apple, which welcomed LinkedIn inside its Mac OS X operating system. Though we haven’t seen many examples of apps taking advantage of this, it's an interesting beachhead that few have taken notice of.


Others Waiting In The Wings


Could others make a play for this market?


Twitter is an obvious contender. It is popular as a login option, particularly with apps that let users generate or share content, like news readers. And many enterprise apps are integrated with Twitter. (Take, for example, the content-management system which published this story on the Web and on Twitter simultaneously.) For many users—particularly journalists, marketers, and celebrities—Twitter has also become their public-facing persona online.


Yet it’s hard to reconcile Twitter's town-square feel with the cloistered campuses of corporations. Twitter may end up always being a megaphone, not an intercom.


Salesforce, too, is worth watching. In 2012, the software maker launched Salesforce Identity, and it opened up the service to app developers a couple of months ago. It's less of a source of identity like a Google or Microsoft account, and more of a bridge to various online accounts, but it could grow to become something more.


What’s clear is that becoming the way people log in to work applications is increasingly valuable, as more and more of our tasks shift completely online. Those who check our IDs aren’t just gatekeepers: They hold the keys to hundreds of millions of users who could become customers for the next great enterprise app. As such, the war over professional identity matters to employees, managers, and app developers. It will be an epic contest.


Photo by Flickr user wonderferret






from ReadWrite http://readwrite.com/2013/12/20/professional-identity-2014-predictions


PayPal Now Officially Owns Braintree




PayPal has completed its $800 million acquisition of Braintree, a payments provider known for its platform for mobile app developers and its Venmo service, which lets consumers send each other payments through an app.


Now that the deal is done, expect PayPal to aggressively court developers to integrate its payments service in their apps. It recently acquired Stackmob, which provides backend services to developers, to make its offerings more appealing.






from ReadWrite http://readwrite.com/2013/12/19/paypal-braintree-acquisition-closed


Mailbox Now Supports More Than Gmail




Today Mailbox, the email application that claims to make it easy to get to "Inbox Zero," introduced support for Yahoo Mail, iCloud, me.com and mac.com email accounts. According to the company, Mailbox gets more requests for Yahoo Mail and iCloud support than any other feature.


Before today's update, Mailbox only supported Gmail accounts.






from ReadWrite http://readwrite.com/2013/12/17/mailbox-yahoo-mail-icloud


Facebook Now Testing Intrusive Autoplay Video Ads In Your Newsfeed




Facebook's autoplay video advertisements are rolling out this week on its website and mobile apps. When the company first introduced autoplay video in September, it was all but inevitable that Facebook would begin to sell video advertising.


And now it has.


Facebook is beginning with trailers for the movie Divergent. The company says it's a limited test, but it's almost certainly just a matter of time until Facebook opens up autoplay video to a variety of other advertisers.



You might have already noticed videos shared by friends or "verified" pages you follow automatically playing in your newsfeed. Facebook's new video ads will work the same way. When you scroll up to an ad, it will begin to play without sound. If you click or tap on the ad, the video will play with sound. When it's done, it'll display a carousel of two additional videos from the same marketer, just in case you'd like to watch more ads.


Facebook says that for now, only a small number of people will start seeing ads in their timelines. There's no way to stop or prevent an autoplaying video, and Facebook suggests that you simply scroll past it if you don't want to watch it.


One major concern for mobile users was the additional data required to show autoplay advertisements. Facebook claims it pre-downloads video over Wi-Fi connections so they don't eat up your mobile data allocation.


Here's a Facebook video explaining its videos:











from ReadWrite http://readwrite.com/2013/12/17/facebook-autoplay-video-ads


Dell's Business Model Shifts To The Cloud In Pact With Dropbox

In a move to make itself more relevant to companies hungry for drag-and-drop online storage, Dell announced new plans that will bring Dropbox to several of Dell’s products and services.


Dell’s expansion into cloud-based storage says a lot about its future strategy. Following its 2011 break with EMC as its storage provider, Dell quickly aligned itself with many cloud-based storage providers and application vendors.


This week, Dell Ventures—the company’s venture capital arm—announced a fresh round of $300 million to invest in strategic startups to help build out Dell’s data centers, storage and mobile products. The round follows $60 million that Dell invested last year for storage-specific companies to help it build out its data center business. Popular personal cloud startup Dropbox is expected to gain a portion of those funds.


The venture capital money follows Dell’s announcements last week that its salespeople will be offering Dropbox for Business to new and existing customers. Dell said it will also pre-install Dropbox’s online storage service (complete with Dell’s own brand of data protection software) on its consumer and business tablets.


Dropbox boasts that it is used by more than 4 million businesses and that upwards of 1 billion files are uploaded to it every 24 hours. That’s a small drop in the bucket compared to the 1 exabyte of data that analysts suggest is stored in the cloud. That’s a key market Dell is hoping to be a part of.


To get there, Dell will promote the use of Dropbox and provide its customers’ IT departments with software support to make sure Dropbox meets compliance and regulatory requirements. Dell also wants to avoid any data meltdowns like the ones Dropbox had earlier this year.


Dell’s other notable cloud storage partnerships include its 14-year run with Red Hat; OpenStack cloud and open source application infrastructure provider Mirantis; and solid-state storage maker Skyera.


As more businesses move simple storage to cloud-based systems, providers like Dropbox are sure to be in high demand.


Photo courtesy of Flickr user mekuria getinet






from ReadWrite http://readwrite.com/2013/12/17/dell-dropbox-pact-perks-up-business-argument-for-online-storage


Facebook Rolls Out Donate Button, And Keeps Your Payment Information




Today Facebook announced "Donate", a new feature that lets people give money directly through Facebook by clicking on a "Donate Now" button next to posts from nonprofit organizations. Facebook says that 100 percent of your donation made through Facebook will go directly to the charity of your choice.


Of course, as Mike Isaac at AllThingsD points out, transactions will require a credit card, payment information that Facebook will keep on file and thus will have available to help tempt users into future non-charitable purchases. Assuming, of course, that users want to trust Facebook with their credit card information in the first place.






from ReadWrite http://readwrite.com/2013/12/16/facebook-donate-button


Google Could Indeed Homebrew Its Server Chips—Just Not Soon

Rumors around the Googleplex suggest the search engine is considering making its own server chips. The news may have shaken but not stirred semiconductor legends Intel and AMD.


A Bloomberg report cites sources who suggest Google will ditch “Intel Inside” for its millions of data-center servers in favor of chip designs developed by ARM. Those low-power chips are most frequently used in smartphones, tablets and other mobile devices. ARM designs typically do not find their way into server processors.


Like other operators of massive data centers, Google is actively evaluating its opportunities to deploy ARM-based servers in lieu of the x86-based units now in place, to save upfront and ongoing costs. There are many silicon vendors actively pursuing the ARM-server opportunity, including Marvell, Calxeda and AMD.


So Google doesn't really need to design its own processors to get those benefits. It does already design its own server motherboards for Intel processors; since those are designed for specific tasks, the servers don't need a lot of the options featured on general-purpose boards.


The economics of chip design are very different. The upfront costs are way higher, and the production economies really favor higher volumes than a single company, even one with Google’s scale, can generate, says Nathan Brookwood, a research fellow at semiconductor consulting firm Insight 64.


“The only possible rationale for a company like Google to do its own [processors] would be if it had some proprietary algorithms that it wanted to implement in silicon, rather than in software,” Brookwood told ReadWrite. “Even so, the performance and/or power benefits would have to be very compelling to justify a move like the one currently rumored.”


In short, are there ARM servers in Google’s future? Most likely yes, admits Brookwood. Will Google design the chips in those servers? Most likely no.


Intel, AMD and Google did not make public statements supporting the Bloomberg report.


Photo courtesy of Flickr user Robbie1, CC 2.0






from ReadWrite http://readwrite.com/2013/12/13/google-arm-semiconductors


Software May Be Eating The World, But Open Source Software Is Eating Itself

Software may be eating the world, as Marc Andreessen posits, but open-source software seems to be eating itself. And at a far faster clip. While the software world has grown used to products and their vendors dominating for long stretches (think: Microsoft in operating systems and Oracle in databases), the new world of open source is moving at an accelerated, Darwinian pace, leaving no project to rest on its laurels.


In this fast-changing open source world, how should enterprises decide where to invest?


Open Source Picks Up The Pace


Though Dirk Riehle's analysis of the total growth in open source projects is a few years old, if anything the trend he plots has accelerated:



Today much of the interesting code in technology’s most important markets—Big Data, cloud, mobile—is open source. With more activity focused on areas like Hadoop or OpenStack, we should expect the pace and volume of open code creation to increase.


Which may be good or bad.


No Rest For The Open Source Developer


Take, for example, the configuration management market. Redmonk’s Stephen O’Grady sifts a number of data sources that measure the popularity of Chef, Puppet, Ansible and Salt, the latter two being very new to the market, yet demonstrating considerable community enthusiasm and adoption.


This prompts O’Grady to speculate that “Where it once was reasonable to conclude that the configuration management space would evolve in similar fashion to the open source relational database market—i.e. with two dominant projects—that future is now in question.”


O’Grady goes on to suggest:


The most interesting conclusion to be taken from this brief look at a variety of community data sources, however, may well be the relevance of both Ansible and Salt. That these projects appear to have viable prospects in front of them speaks to the demand for solutions in the area, as well as the strong influence of personal preferences—e.g. the affinity for Salt amongst Python developers.

Actually, I’d argue that the most interesting conclusion is that no open-source project has guaranteed longevity. Puppet came out in 2005 and is still making headway against entrenched proprietary incumbents, yet now it has to fight off Chef (which came out four years later) as well as Ansible and Salt (both of which arrived within the last two years).


Yes, incumbents in any important market, proprietary or otherwise, will always have new market entrants nibbling at their heels. But in open source, the competition doesn’t wait for billion-dollar markets to form before it launches attacks. The rise of Salt and Ansible in a market already well-served by Chef and Puppet is a testament to this.


The Community Giveth, And The Community Taketh Away


You will find this same dynamic in content management (Drupal vs. Joomla vs. Alfresco vs. Wordpress vs. countless other CMSes), cloud (Eucalyptus vs. OpenStack vs. CloudStack vs. CloudFoundry vs. OpenShift vs. many others), web servers and databases, both relational and NoSQL.


The ranks of open-source databases swell with new entrants almost daily, as can be seen on the DB-Engines database tracking service. Perhaps most interesting is the open-source relational database market. Up until recently, MySQL dominated that market. Postgres was a viable runner up to MySQL, but it was a very distant second.


Today things are in motion. Or commotion. Largely due to Oracle’s alleged fumbling of the MySQL community, Postgres is on a tear, booming even with the hipster crowd that welcomed MySQL. But so is MariaDB. Though MariaDB is still a comparative gnat, leading Linux distributions like Red Hat’s Fedora and Ubuntu have embraced it as a MySQL replacement, as has Google.


Perhaps, as O’Grady implies, this comes down to developer preferences. If developers rule, then little impedes them from switching to new projects that may fit their needs better, throwing a given market into disarray. If this is correct, it would explain why open source resists long term monopolies:


It’s hard to keep developers happy.


Building A Community-Friendly Business


What does this mean for enterprises that are looking to make long-term investments in a given open-source project? An easy, if unsatisfying, answer is that enterprises should contribute to the projects they care about, ensuring their sustainability and giving the enterprise the ability to support itself should the project dwindle.


But most enterprises don’t want to have to code the winner themselves.


Instead they should look for popular projects that are good technical fits for their enterprise requirements and that have strong communities. Popularity can be fleeting if a project grows callous to its community. One of the primary reasons Linux has endured so long at the top of the operating system heap is that it has been so accommodating to community influence and requirements.


Unfortunately, there’s no One True Way to measure vitality in an open source community. Some successful projects, like OpenStack, lean on a strong foundation. Others, like Linux, depend upon a strong individual and her lieutenants.


But all successful open-source projects that maintain their lead innovate quickly, with regular releases every few months. While a fast-moving project may be more difficult for enterprises to support, it may also be a key indication that the project will remain relevant.


How else should enterprises hedge against the risk of obsolescence of an open-source project?


Lede image courtesy of Shutterstock.






from ReadWrite http://readwrite.com/2013/12/12/open-source-innovation


Readers: Map Out The Future Of ReadWrite With Our Survey

We at ReadWrite are constantly searching for better ways to serve our audience, including covering the stories you would most like to see, in a way that resonates with you. So we're conducting a reader survey to learn more about our audience. Alongside your comments, your searches, and the articles you choose to read, this survey provides another set of valuable data for us to ingest.


Please take two minutes to answer a few questions, and help us create more of what you want.



If you have problems with the embedded survey, click here to take it.






from ReadWrite http://readwrite.com/2013/12/09/readwrite-reader-survey


2M Passwords For Facebook, Twitter And Others Stolen In Massive Data Breach




Have a Google, Facebook, Twitter, LinkedIn, or Yahoo account? If so, you might want to change your password, stat.


According to cybersecurity firm Trustwave, hackers using a nasty piece of work called the Pony Botnet Controller have stolen usernames and passwords for nearly two million accounts. The firm determined that a malicious keylogger installed on users’ computers was to blame.


Researchers at Trustwave said this massive data breach has been going on for a month, but they only discovered and publicized their findings Tuesday, CNN reports. Facebook, LinkedIn and Twitter told CNN that they've notified affected users and reset their passwords; Google declined to comment, and Yahoo didn't provide a response to CNN.






from ReadWrite http://readwrite.com/2013/12/04/passwords-hacked-stolen-pony-botnet


Plugging Energy Management Into The Connected Home



This is a post in the ReadWriteHome series, which explores the implications of living in connected homes.



The ideal connected home can do something that, alas, many of us humans cannot: Turn off the lights behind us when we leave the room.


Mundane tasks such as turning off the TV and making sure we didn't leave the stove running when we leave the house are the sorts of things that drive a fair bit of consumer desire for the connected home. Some of this is to relieve stress, but there's also real money to be saved. Trim a few pennies in electricity consumption here and there, and it can add up to real savings when the monthly power bill comes due.


Who Watches The Watchers?


There is good news and bad news when it comes to energy management in the connected home.


The good news first: There are many tools and devices out there that can give you real-time monitoring and control of energy use in your home. The bad news is, there are so many of these companies out there, it's hard to get a sense of who's going to be around for the duration in the Wild-West atmosphere of environmental technology.


It's not a problem to be ignored. In the course of researching this article, I discovered a number of promising devices and services, only to find later that the vendor had shut down or disappeared following an acquisition by a larger company.


That's a big problem for energy monitoring devices connected to cloud-based services, since you don't really want the device maker or the service provider vanishing without notice. So when choosing an energy monitoring tool, it's a good idea to do a little legwork and see who's been around a while and will have the endurance to keep going. There are no guarantees—companies get bought and sold all of the time—but continuity is not a bad feature to keep in mind.


Energy-monitoring systems come in three primary classifications: outlet monitors, whole-home monitors and fully integrated home automation systems.


Plug Into Outlet Monitoring


Power monitoring at the outlet level is a pretty straightforward affair: plug the monitor into the wall, then plug whatever electrical device you want to use into the monitor's outlet.



The Kill A Watt monitors at the outlet level.



In North America, one long-standing device in this category is the P3 Kill A Watt, which measures power consumption by the kilowatt-hour, from which you can then figure out the cost of use for the device.
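The math from there is simple: multiply the kilowatt-hours the meter reports by whatever your utility charges per kilowatt-hour. A quick sketch, using an assumed $0.12/kWh rate rather than any particular tariff:

```python
# Turn a Kill A Watt style kilowatt-hour reading into an estimated dollar cost.
# The electricity rate below is an assumed example, not a quoted tariff.
def monthly_cost(kwh_per_day, rate_per_kwh=0.12, days=30):
    """Estimated cost of running one device for a month."""
    return kwh_per_day * days * rate_per_kwh

# e.g. a media center drawing 1.5 kWh per day at $0.12 per kWh:
print(f"${monthly_cost(1.5):.2f} per month")   # -> $5.40 per month
```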


Homeowners in the U.K. or E.U. can get similar monitoring service from the Wattson Classic device, which uses sensor clips to monitor energy usage for outlets.


Watching The Whole House


If you want something a little more comprehensive, there are a number of devices on the market that can monitor power use for an entire house or apartment.



The Ambient Energy Orb gives colorful signs when energy use is pricey.



One of the prettier tools in this category is the Ambient Energy Orb, which features a real-time digital display of power use, mounted in a lighted ball that glows green when electricity use and pricing is low, and red when demand and pricing is high, prompting users to start turning off lights (or TVs, or computers).


Canadian vendor Power2Save has a whole line of wireless energy monitors that attach to your home's circuit breaker panel and deliver comprehensive consumption data to North American users. Some of these devices, like the E2, can be monitored with a PC or Mac, or (as with the Home Hub) with an iPhone or Android app.


All-In-One Management


Monitoring is one thing; fully integrated control systems are something else entirely. With some careful programming, you can manage devices on a hyper-efficient schedule or control them even when you're away from home.


Insteon has one such home-automation system, if you want to go that route. Though their systems have security and safety monitoring uses, Insteon has a strong focus on managing power and home devices.


Belkin's WeMo products are not quite as comprehensive in the feature department, but the systems are getting high marks from users for their ease of use.


All of these systems can deliver useful information to the consumer about their energy use. Not included, but still available to some power users: smart meters from the electric company that can handle the task of energy monitoring. Either way, home energy monitoring is coming, hopefully with savings to your wallet.


Lead image courtesy of Shutterstock






from ReadWrite http://readwrite.com/2013/12/02/connected-home-energy-management


Think Email Is Dead Outside Of Work?

A 2012 Harvard Business School study, “E-Mail: Not Dead, Evolving,” found “communication between individuals—the original intent of e-mail—isn’t even listed in the top five activities” of how we use email today.


I have worked in the world of technology since 1982 and even worked as the vice president of an email services company. I tend to lean pretty heavily on email in my work world, but I have noticed how its use is changing in my personal life.


My most technologically literate friend, Stephen, and I often communicate by Twitter. Some friends who used to send me emails now mostly communicate with me by comments on my Facebook feed. Some even presume that I might stoop to reading Facebook email, which I listed as one of the ten things that the tech industry should fix.


I have found that my thirty-something friends and family prefer to text me on my smartphone. I am okay with that since I found the MightyText app that lets me send and receive text messages from my Google Chrome browser and on my tablet.


My personal reality seemed to be shaping up to a handful of people outside of work who still communicate with me by email. Even some of them are only responding to emails that I send. The golden age of personal email seemed to be receding into the mists of time.


It is different in the business world, where stats show that 48% of consumers prefer email as the way to communicate with brands. That explains why I have spent the last week trying to decide between Constant Contact and MailChimp as email marketing platforms.


Can Email Be The Great Equalizer?


About six months ago, two things happened to change my sense that email is dying as a form of communication in my personal life. First I got elected to the board of directors of our homeowners association (HOA). Second, our minister decided that communication between the committees led by the elders of the church would go electronic.


One of the reasons I got elected to the HOA board was the hope that I would create an online calendar and perhaps establish email communication between the board and homeowners. I did end up doing all of that but it turned out to be the easy part of the volunteer job.


At the church, I was already in charge of our website and the communications committee.


Together, these two events gave me a completely new perspective and perhaps a hope that email for communication between people outside of business still has some life even if it will not be as glamorous as the earlier days of email.


Most of us in the technology world work in environments where we share files on a regular basis. At WideOpen Networks, my day job, we use Skype, Dropbox, and Highrise to share a lot of Pages files. When I am writing an article for ReadWrite, I often write the article in Google Docs and I can usually just attach the file directly through Trello, the content management solution which we use or upload a rich text format (RTF) document to a Trello card.


In both work cases, I am dealing with folks who understand files and things like Dropbox, Box, Google Drive, and SkyDrive. If there is a problem, it usually can be solved easily by sending someone an RTF document.


Life is not nearly as simple when you start trying to share files with people of varying ages and technology skills.


The Challenges Of Email And File Sharing


When I started sending my files to other church elders, I thought the easiest and most foolproof thing would be to share a document and send the sharing notice with the content of the report pasted into the email. To be blunt, that was a disaster. Some complained that they could not even open my email. It left me wondering how that could be.


The mystery started to clear when it occurred to me that a lot of people have become occasional email users, and they are accessing their email on everything from a browser pointed at ISP-provided webmail to iPads and smartphones running a variety of email clients, some of which an email snob like me considers pretty shaky.


One of my preferred technologies is IMAP email and preferably IMAP on a server in the cloud that I manage or one that is managed by people who actually know what they are doing and are focused on getting my email from me to the people I want to contact. While I use Gmail (IMAP version of course) for personal email, it is not my choice for business email.


I am not a big fan of webmail portals, which I consider at best a necessary evil for when a hotel’s Internet service blocks a port and makes it impossible to use client-based email.


When I started looking at the email providers used by some of the people with whom I was trying to communicate, I knew that attachments were likely going to be problems.


One Man's Battle With Attachments


Recently, I found out just how much of a problem attachments can be even in a very small group. At our most recent HOA board meeting, I ended up being the secretary when Anne, our very competent secretary, had to take one of her children to the doctor.


I managed to scribble down some notes and took Anne’s advice and typed them up that same evening while things were still fresh in my mind. I actually tried typing them up in Pages 5, since I was writing my Why Less Might Be More In Pages 5 article. I had some trouble getting the bullet numbering right so I moved it to Google Docs and actually sent her a Word docx file. There were a few details that needed to be added a little later before the minutes were finalized.


A few days later she sent out the completed minutes. I had no trouble viewing the file she sent, but I did notice that somehow the file extension had been stripped. I added a .rtf extension and opened the file in Word, but strangely it would not open in Nisus Writer Express or Pages 5. I chalked that up to stuff that just happens in the computer world.


We were already having more than a little trouble getting everyone’s approval on the emailed minutes attachment before the minutes could be printed and mailed out. When we did not hear from the other two board members regarding the attachment, I sent an email to Anne and said that since she was out of town I would print the minutes and take them to the other board members. I did that at noon the next day.


At the first board member’s house, I was told they had two computers and one computer seemed to be eating all the emails before the other one could read them. Following my rule of never getting involved in solving a technology problem unless the person is a blood relative, I did not bring up the subject that their email was likely POP and the first computer was likely removing the email from the server. I handed them the printed copy and just made sure the board member was happy with it.


At the second and last house, the wife of the HOA’s president took the printed copy and said she would deliver them to her husband when they met for lunch later that afternoon.


I did not think anything more of our problems until the president of the HOA showed up at my door that same Saturday afternoon. While he had gotten the printed copy of the file that I delivered, he wanted to know why he could not open the attachment sent by our secretary. He had tried unsuccessfully on his Android tablet and Android smartphone.


It took me a minute to remember the missing file extension on the attachment and a lot longer to find a free app, OfficeSuite, to install on his smartphone. Just to be perverse, before I forwarded him the file again, I added a .docx extension to the original file the secretary had sent. I tested the file in Mobile Office 365 on my smartphone before opening it without any problem on his smartphone using OfficeSuite. He left happy that he could read the minutes. I did not spoil the good feeling by telling him a program could easily strip the extension again the next time the minutes are sent.
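For what it's worth, the format of an extension-less attachment can usually be guessed from its first few bytes: a .docx is really a ZIP package, and RTF files begin with a telltale marker. Here is a rough heuristic along those lines, a sketch rather than a robust detector:

```python
# Guess a sensible extension for an attachment whose name has been stripped.
# A .docx is a ZIP container (magic bytes "PK"); RTF text starts with "{\rtf".
def guess_extension(path):
    with open(path, "rb") as f:
        head = f.read(8)
    if head.startswith(b"PK"):
        return ".docx"    # really "some ZIP container"; .docx is the likely case here
    if head.startswith(b"{\\rtf"):
        return ".rtf"
    return ""             # unknown; leave the file name alone

# Usage: if guess_extension("minutes") returns ".docx", rename the file to
# "minutes.docx" before forwarding it on.
```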


Lessons Learned


All of this is far more complex than it needs to be. Holding classes on how to collaborate with others using electronic devices is beyond what I want to tackle in an area that I love but which gets most of its time sensitive communications from hand-lettered bed sheets on posts at the intersection of the main highways instead of through Twitter.


It turns out that email is the solution. You just have to keep it very simple. If you have to share something with people with whom you do not work, do not send attachments. Just copy the text of your report and paste it as plain text into an email.


Do not even dream of trying to get a diverse group of people using Google Drive or Dropbox; just be smart and revert to the simplest email that you can use. Follow my recommendations and use plain text email, and cross your fingers. At our church, which is a larger group, I just quit doing reports. It makes life a lot easier.


Image courtesy of Shutterstock.






from ReadWrite http://readwrite.com/2013/11/29/think-email-is-dead-outside-of-work


Tablets: The Tech Guru Gift Guide


ReadWrite Shop is an occasional series about the intersection of technology and commerce.


Brace yourself: Tech enthusiasts across the land are about to get swamped by friends and family members begging for help on gadget gifts. Yes, that means you.


In the past, choosing a tablet often hinged on tech specs and app selection. But the gap in hardware performance has narrowed quite a bit, and app stores for Apple and Android devices are much more alike than they used to be. (Most of the top iPad apps are available on Android at this point.) So how to choose?


To steer your relatives through these shoals, turn to our handy tablet gift-guide flowchart and revel in the admiration and awe you'll receive in return.





Lead image via Flickr user ebayink, CC 2.0






from ReadWrite http://readwrite.com/2013/11/28/tablet-buying-guide


Terremark Gets Surgically Removed From HealthCare.gov

The flailing convulsions that have made up the launch and subsequent recovery of the Affordable Care Act's HealthCare.gov website are still managing to inflict damage on vendors who had a hand in setting it up in the first place.


Next up: Verizon Terremark, which was the web-hosting provider for the online marketplace, has been given the boot by the Department of Health and Human Services. HHS opted not to renew its contract with Terremark, and instead awarded the winning bid to Hewlett-Packard, the Wall Street Journal reported.


HP's Enterprise Services group put in the winning $38 million bid to start taking over the web hosting duties this summer.



Anyone following the HealthCare.gov debacle will not be terribly surprised by Terremark's fallen status. Health and Human Services Secretary Kathleen Sebelius threw Terremark under the bus in her Oct. 29 testimony to Congress when asked about a recent failure on the site. Sebelius pinned the blame squarely on Terremark.


A couple of things leap out at me about this move. The first is quite cynical: I hope HP can survive its encounter with HealthCare.gov. The second is more of an observation on mixing technology with politics: it rarely works.


The issue here is that in their quest to figure out what went wrong with HealthCare.gov, politicians are really seeking to figure out who to blame—and ensure that blame does not fall on them.


Verizon Terremark may have indeed dropped the ball on HealthCare.gov, but they are hardly the only ones. Pushing them out now may be necessary—none of us are fully privy to the mess that's been made—but it seems wildly counterproductive to lose one web-hosting provider and force a transition to another one at a time when many other things need to be fixed.


It's like asking a patient to be moved to another hospital, while he's in the middle of open-heart surgery.


Image courtesy of Shutterstock.






from ReadWrite http://readwrite.com/2013/11/28/terremark-removed-from-healthcaregov


Fiber-Optic Networks May Be NSA's Back Door Into Secure Data Centers

Data centers are regarded as the Fort Knoxes of the digital age: heavily guarded and impregnable to any intruder that tries to get their hands on the data within.


So how does a government agency like the NSA manage to get hold of data from the likes of Google and Yahoo? Easy. Instead of taking the gold from Fort Knox, the intelligence agency may be hijacking the data on the road, like an Internet highwayman preying on the one vulnerability every data center has: the data has to go somewhere eventually.


The "road," in this case, are the fiber-optic cables that comprise the backbone of the Internet. According to the New York Times, Google and Yahoo are increasingly suspicious that Level 3 Communications, which provides the Internet cables for the two Internet service vendors, is allowing the NSA to grab data in transit between data centers.



...[O]n Level 3’s fiber-optic cables that connected those massive computer farms—information was unencrypted and an easier target for government intercept efforts, according to three people with knowledge of Google’s and Yahoo’s systems who spoke on the condition of anonymity.



Level 3 isn't the only company that runs the fiber-optic cables: companies like BT Group, Verizon Communications and Vodafone Group are in this category as well. It is not known for sure whether any of these companies are actually providing access to intra-data center communications, but given the NSA's use of secret warrants with attached gag orders to subpoena data directly from data center providers, it does not seem like much of a leap to think that the agency is doing the same with the network vendors.


The lesson here for all of us who use the Internet? If the government really wants your data, it's going to get it, one way or another.


Image courtesy of Shutterstock.






from ReadWrite http://readwrite.com/2013/11/26/fiber-optic-networks-nsa-back-door-secure-data-centers


Apple Busts A Move With PrimeSense Acquisition

Israeli start-up PrimeSense is the latest acquisition for Apple, which has picked up the 3D sensor technology vendor for $350 million.


PrimeSense's technology is one of the key elements of the sensing tech used within Microsoft's Xbox consoles. But it is doubtful that this is a move to try and hobble Microsoft's gaming efforts. Microsoft no doubt has an iron-clad licensing agreement for PrimeSense's contributions to the Kinect system, and a lot of Kinect comes from efforts directly within Microsoft Research.


The acquisition is much more about Apple, which could incorporate the capability to sense body movements into devices like the iPad and Macbook, and perhaps the Bigfoot-like iTV we keep hearing about in "exclusive leaks" to the media.


Control of devices by movement would be a cool addition to the Apple lineup, and we're looking forward to seeing it soon.






from ReadWrite http://readwrite.com/2013/11/25/apple-primesense-acquisition


Glass Explorer Contest Winner Takes Aim At Alzheimer's

As Google Glass 2 makes its way to the members of the Explorer program, ReadWrite is happy knowing that one of the newest explorers is a ReadWrite community member.


Earlier this month, ReadWrite held a contest that awarded an invitation to the Glass Explorer program, based on the entry that would have the best potential to do the most social good.


Of all of the qualifying entries, the one that impressed our panel of judges the most came from Nasr Mobin of San Diego, CA.


Mobin's entry was simple and profound. He wants to create an app



...[F]or people who are experiencing Alzheimer's. Not only it will help them with memories they have forgotten (people's face and names and properties, work they have done in a day and other days in past, etc.) that don't have a solid memory [of], it also could provide brain practices at different times through out the day automatically (recommended by their doctor). Eventually it could become their personal memory assistant.



Mobin's idea has merit; a memory assistance device could be an invaluable aid to those suffering from Alzheimer's, and their families.


Mobin has reported he has already received his invitation and placed the order for his own Glass device. We are looking forward to hearing from Mobin and finding out how his project is progressing.






from ReadWrite http://readwrite.com/2013/11/22/glass-explorer-contest-winner-takes-aim-at-alzheimers


Google Patents Tech That Could Take The 'Social' Out Of Social Networking




Google has patented plans for software that learns how you behave on social networks and can automatically generate suggestions for "personalized" reactions to tweets and Facebook posts.


As the BBC first noted, the ostensible goal of the software is to help users keep up with and reply to all the interactions they receive, especially critical ones. However, technology like this could be counterproductive; the whole point of social media is to, well, be social, after all.






from ReadWrite http://readwrite.com/2013/11/22/google-social-robot


Windows Phone Users Can Finally Experience The Joys Of Instagram




Today Instagram is finally available for Windows Phone, with just one tiny flaw. You can't actually capture videos in the app, and capturing a photo is complicated.


Though initial reports said that users couldn't capture photos with the Instagram app, our own Dan Rowinski reports that it works just fine; it simply takes you to the camera roll first. The app brings users to the Windows Phone's native camera rather than an Instagram camera, but most people won't notice, since it brings you straight back to the app with the photo you have taken, ready to crop and add filters to. Video capture isn't currently available.


The Instagram app available through the Windows Phone Marketplace is labeled a beta, so it is likely that the Facebook-owned photo-sharing platform will issue updates to the app soon.


Instagram wanted to release an app as quickly as possible, so it focused on Instagram's core features and will continue to develop the product to bring additional features in the future.


"Most people upload photos from their camera roll, so with the beta version of the Windows Phone, we're starting with the experience most people already use," a spokesperson for Instagram said.


Instagram had been reluctant to build an app for Windows Phone, but Nokia announced last month that an Instagram app was in fact in the works. Nokia's Lumia 1020 is arguably the best smartphone camera on the market, so it makes sense that the company would want the premier photo sharing service on its phones.


It seems as though Instagram for Windows Phone is optimized specifically for the high-quality images the smartphone cameras produce, or so Joe Belfiore, who manages the Windows Phone team, tweeted earlier today.









from ReadWrite http://readwrite.com/2013/11/20/instagram-introduces-app-for-windows-phone


Enterprise Data Needs Still High On The Pain List

A lot of people look askance at the idea of big data, wondering if it is more hype than substance. But consider this: Microsoft's most successful corporate acquisition to date is a company that does nothing but manage the huge growth of data for customers.


The data pressure, whether you buy into the hype or not, is most definitely on for companies, and StorSimple is one of the vendors seeking to alleviate that pressure.


This week marks the one-year anniversary since the crew in Redmond formally acquired StorSimple, and because of the phenomenal success the StorSimple acquisition has had, Microsoft is taking a little time to celebrate.


StorSimple, which was founded in 2010, is a hardware storage gateway vendor that specializes in tiered storage. Tiered storage is a way to organize data according to where it lives and how quickly it needs to be accessed. One common example of tiered-storage use is to store lesser-used archival "cold" data in a public cloud, while keeping more-used "hot" data stored locally for faster access.


Because the tiers are transparent to end users, the distribution of data is seamless. If the resources on the public cloud are large enough or use good, elastic policies to expand on demand, then users essentially get "a bottomless file server," according to Microsoft Corporate Vice President Brad Anderson.
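To make the hot/cold idea concrete, here is a toy version of a tiering policy: files untouched for 30 days become candidates for the cloud tier, and everything else stays local. StorSimple's actual gateways work at the block level with far more sophisticated heuristics, so the threshold and logic below are purely illustrative.

```python
# Toy illustration of a hot/cold tiering policy: data not accessed recently
# is flagged for migration to a cheaper cloud tier. Threshold is an assumption.
import os
import time

COLD_AGE_SECONDS = 30 * 24 * 3600   # 30 days; tune per workload

def tier_for(path):
    """Return the tier this policy assigns to a file."""
    age = time.time() - os.stat(path).st_atime   # seconds since last access
    return "cloud" if age > COLD_AGE_SECONDS else "local"

def plan_migration(root):
    """List the files the policy would push to the cold (cloud) tier."""
    cold = []
    for dirpath, _dirs, names in os.walk(root):
        for name in names:
            path = os.path.join(dirpath, name)
            if tier_for(path) == "cloud":
                cold.append(path)
    return cold
```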


Anderson, who oversees Microsoft's Windows Server and System Center products, isn't what you would call subdued about StorSimple's success to date.


"In the six months prior to the Microsoft acquisition, versus the six months post-acquisition, StorSimple did seven times the business," Anderson related in a recent interview.


Or put another way: in those six months after the acquisition, StorSimple was hitting numbers it wasn't expecting to hit until three years after it was bought.


Data Is Life


Numbers aren't usually a focus of the story, but they are important to mention in this instance, because they are a solid piece of evidence that suggests there is more to this data explosion than mere hype.


Anderson said that many of their customers are seeing data growth rates of 40-50% per year, and that storage is the fastest growing line item for data center budgets. That feels pretty market-y, but it does match the ongoing conversations people have been having in the enterprise about data management and storage.
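It is worth pausing on what 40-50% annual growth actually means: at those rates, a company's storage footprint roughly doubles every one-and-a-half to two years. A quick back-of-the-envelope check:

```python
# How long does data take to double at 40-50% annual growth?
import math

for rate in (0.40, 0.45, 0.50):
    years_to_double = math.log(2) / math.log(1 + rate)
    print(f"{rate:.0%} growth -> doubles in about {years_to_double:.1f} years")
# 40% -> 2.1 years, 45% -> 1.9 years, 50% -> 1.7 years
```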


It is easy to see "data" and start thinking "big data"—with all the attendant analytics, application development and magic pixie-dust that big data hype will usually bring. But even regular data needs—the kind that just needs to be stored for business, backup or disaster recovery reasons—must be managed.


And, ideally, without breaking the budget. Anderson highlighted the City of Palo Alto, Calif., which jumped to StorSimple after reviewing Storage Area Network options that would have run the city $250,000. The bill for StorSimple was closer to $60,000.


Cost is a big driver for StorSimple customers, and so is flexibility. The seamless approach described earlier is a boon not only to end users but to their IT managers as well. It also gets them exposure to using a public cloud-based platform.


That doesn't bother Anderson: 50% of new StorSimple customers are using Windows Azure for the first time. Data storage in the cloud could be the gateway drug for enterprises and small- to medium-sized businesses to more cloud computing use later.


As entire ecosystems continue to grow around big data and data analytics, companies are still very much seeking the less flashy, but still critically important, tools to manage everyday data. Businesses still have work to get done, and now more than ever, even “ordinary” data is a company's lifeblood.


Image courtesy of Shutterstock.






from ReadWrite http://readwrite.com/2013/11/20/enterprise-data-needs-still-high-on-the-pain-list


Google And Microsoft Put Differences Aside To Fight Child Porn

Google Chairman Eric Schmidt is still assuring users and politicians in the UK that Google is working hard to combat the problem of child pornography—and he's also giving credit where credit is due to Microsoft.


In a Daily Mail article, Schmidt outlined the steps Google is taking to rid its search engine results of exploitive images of children. These steps include deterrence: warnings crafted by Google and child-protection charities will pop up anytime someone enters a search term seeking such images.



Microsoft and Google are teaming up on the detection and removal steps. Once illicit images are correctly identified, they will be digitally tagged with a unique fingerprint.


"This enables our computers to identify those pictures whenever they appear on our systems. And Microsoft deserves a lot of credit for developing and sharing its picture detection technology," Schmidt wrote.


The third step in Google's plan is providing technical support to organizations such as Internet Watch Foundation in the U.K. and the U.S. National Center for Missing and Exploited Children.


All of these steps are positive moves forward in the fight against child pornography, though they are very much geared to what Google and Microsoft can actually do: get the illicit images off search networks, which is the first step toward eliminating them altogether.






from ReadWrite http://readwrite.com/2013/11/19/google-and-microsoft-put-differences-aside-to-fight-child-porn


Raspberry Pi Vaults Past 2 Million Sold Mark

It's a computer, but there's no monitor. Or fan, or keyboard, or even a case, for that matter. But the credit-card-sized Raspberry Pi is still getting snapped up by consumers: less than two years after the first Pis shipped, over two million have been sold.


Raspberry Pi falls into a category of computing device known as a miniboard, where the bare components of a computer (processor, video interface, USB ports and memory) are lashed together on what amounts to a circuit board.


But from such a simple device, many things can be created. By plugging in external storage, a monitor, and a keyboard, users can have a Linux computer running in minutes. Or they can build sophisticated electronic devices like a media streamer or an Internet radio.


The flexibility of Raspberry Pi is certainly an attractive feature. So, too, is the price. The two models of the Raspberry Pi cost $25 for the Model A and $35 for the Model B. Both models feature a 700-MHz ARM processor on a Broadcom system-on-a-chip board, with 256 MB of RAM and an SD/MMC/SDIO card slot for onboard storage. The big difference between the two models is that the extra $10 will get you a 10/100 Ethernet port and a second USB port in the Model B.


Bringing Code To The Masses


Two million devices sold is quite an achievement for a project that has its roots in trying to decrease computer illiteracy.


In 2006, team members in the University of Cambridge's Computer Laboratory in the United Kingdom noticed a sharp decline in computer skills in A Level students accepted into their program. Worse, it was a trend they could see being repeated in other nations besides the UK.


Despite the proliferation of personal computers, or perhaps because of it, kids were no longer playing around or experimenting with PCs. Instead, they were using apps as they were presented, or just buying and downloading new ones to do what they wanted. Hacking and coding, it seemed, was going out of style.


The Cambridge team, led by designer Eben Upton, began to put together a small, portable, and very inexpensive device that would boot right into a programming environment. From there, a student of any age could start coding to their heart’s content.


By 2008, the device now known as the Raspberry Pi had completed the design phase and was ready for production. The Raspberry Pi Foundation was founded that year, and after three years of fundraising and production, the Pi devices were rolling off of the assembly line in February 2012.


The team is stunned by the project's success, even as they work on improvements to the popular miniboard device.



We never thought we’d be where we are today when we started this journey: it’s down to you, our amazing community, and we’re very, very lucky to have you. Thanks!



Image courtesy of Wikimedia.






from ReadWrite http://readwrite.com/2013/11/19/raspberry-pi-vaults-past-2-million-sold-mark


Watch Where You Write

When you put time and creativity into sharing your thoughts online with an audience, do you care about fully owning where that experience takes place?


Increasingly, writers are turning to third party platforms to host their works instead of maintaining their own blogs. They're finding readers at places like Medium, Svbtle, Twitter, and Facebook. I'm one of them, but lately I've been wavering over where I want to put my thoughts down on digital paper.


The platforms mentioned above have built remarkable communities, but they are also double-edged swords. On one hand, they can bring a massive audience to an unknown writer, and they're much easier to maintain from a technical perspective. On the other, whoever is packaging your material for you can flip a switch at any time and change the context of your creative output and how your readers access it.


A few factors to consider when choosing to house your creative endeavors on a platform you don't own:



  1. You can only use the technology the platform allows. While the platform may be constantly evolving, from a technical perspective you don't have full control over how you share media or which technical integrations you use.

  2. How you make money (and how much) is largely determined by rules you have no part in making. While those rules treat everyone (all contributors) equally, not everyone is an equal in terms of the talent or audience size they bring to the table.

  3. If a platform disappears tomorrow, or does something that you don’t like, you and your fans aren’t easy to migrate away. It’s not your platform, you’re just using it. This happened to me on Posterous. More on that below.


Now obviously many platforms provide a great service to content creators by offering free hosting for content, and that's huge, but they do it at a cost. Typically that cost is your audience being used as a target for advertising. Maybe it's today, maybe it's in the future.


Social networks and blog collectives serve a very important purpose, but they should not be seen as a canvas an artist feels required to paint on. Centralizing content in one place makes it easy for audiences to discover you, but it also turns you into just another shop in a mall, competing for attention along with the smell from Cinnabon and the guy selling bedazzled phone cases.


So why have I been drawn to specific platforms?


Medium


I was one of the first few hundred users of Medium through a bit of luck. I loved the traffic spikes that came with getting a story posted to the homepage. I also believe the CMS (content management system) is the most beautiful minimalist writing prompt I've ever used.


I've had a chance to interact with lots of people from the bottom to the top of the organization and they sought my advice and implemented some of it. I love what they're doing and if I'm going to cheer on any third party platform it's them. That said, although I briefly considered making Medium my primary venue for writing, I quickly switched gears because I just don't have enough control over how the site presents my work.


Svbtle


Much like Medium, Svbtle built a userbase by curating excellent writing from a notable group of people who generally have large social networks. I really enjoyed the content, but was quickly turned off by a lack of communication between the team running the site and the community they were trying to build.


There was a strange inhuman quality to Svbtle, and I don't feel like that's changed much since I first encountered it. Ultimately, if there's a race to see which new blogging platform "wins," I'm not planning on betting on the Svbtle horse. From my perspective it's just a simple platform that offers little more than a clean design for people who don't want to manage their own website.


Posterous


I had always hosted my own sites, but one day Posterous showed up and offered simple posting (via email) and a powerful community. In many ways it was a precursor to Medium and Svbtle. It was so unique that it actually spurred a friend and me to start a project called "the3six5," a public diary that let a different person write an entry every single day for an entire year.


That project ran for 1,000 days in a row and we accumulated 365,000 words from people all over the world. And then Posterous was shut down. While the data was saved, the Internet shrine we had built was essentially bulldozed. The trust we had put in a third-party platform burned me pretty badly there.


Twitter


As I write this post, it's my sixth anniversary on Twitter. I've tweeted 65,000 times, averaging about 30 tweets per day. Besides the painfully depressing realization that I could have done so much more with my life than this, it's clear that Twitter has successfully convinced me to trust a third-party platform with my words.


Perhaps the brevity that comes with 140 characters makes tweets seem less significant or upsetting to lose, but as of now, I've spent more time writing on Twitter than any digital venue I've ever touched. I will continue to trust Twitter with my work, but only because I can download it at any time.


In the last few years writers have definitely started migrating away from their own domains, but will simpler content management systems and an increased competency in web development swing things the other way?


Where do you prefer to write? Where do platforms like Tumblr and Squarespace fit?






from ReadWrite http://readwrite.com/2013/11/13/own-your-own-words


Google Glass Adds Music To Its Immersive Services

Surround sound used to be a marketing term for clear, rich music and entertainment delivery. But Google Glass is implementing new music search features that will deliver not just sound that surrounds, but sound that integrates more completely into users' lives.


The new Glass music search service will feature music matching by listening to ambient sound, and music playback ... all controlled by voice. To keep playback from bothering people nearby, Glass will also have earbuds through which to listen.






from ReadWrite http://readwrite.com/2013/11/12/google-glass-adds-music-to-its-immersive-services


Post Office Starts Sunday Delivery Service For Amazon

Even as the United States Postal Service wants to scale back on its normal delivery schedule, the mail service is picking up a little extra Sunday business on the side.


The USPS announced a new pilot program for the New York and Los Angeles metro areas that will deliver packages for Amazon.com on Sundays, starting immediately.


Sunday delivery for Amazon will bring the Postal Service a much-needed boost to its revenue, as well as benefit Amazon, which will get reliable and relatively inexpensive shipping service on the one day of the week mail carriers aren't pushing through rain, sleet or dark of night.


Though the expanded service will only be available in the NY and LA areas to start, it is expected that the Sunday delivery service will expand next year to other locations in the U.S., such as Dallas, Houston, New Orleans and Phoenix.


Image courtesy Flickr/Akira Ohgaki via CC.






from ReadWrite http://readwrite.com/2013/11/11/sunday-delivery-service-starts-amazon-usps


State of the OS: Three Operating Systems, Three Upgrades

As a veteran of many operating system upgrades, I am usually somewhat cautious about them, but keeping my data in the cloud has perhaps made living on the wild side a little less dangerous.


I have two desktop computers: a Core i5 Mac Mini and a Lenovo tower, also powered by a Core i5 processor. In addition to OS X, the Mac Mini also runs Xubuntu Linux through VMware Fusion.


During the last ten days of October 2013, I did major upgrades on all three of my operating systems. Over the years I have seen lots of strange things happen when doing a single operating system upgrade. I once did a Mac OS X upgrade and it took me a week to get my email to work again. I have done early Linux upgrades and had applications break beyond my ability to fix them. Linux upgrades caused me so many problems that I gave up on the operating system until I discovered Ubuntu.


I don’t have as much experience upgrading Windows systems since I typically have gotten my new operating systems by purchasing a new computer and passing on my old Windows machines to someone else. Still, I lived through the many upgrades to Vista, so I saw networking on my laptop break more than once.


Doing three major upgrades very close together is obviously inviting trouble. However, it is also a good way to measure if we are making any progress in the operating system world.


Enter The Penguin


For years, I lived by the mantra of a “clean install” when upgrading my Macs. This time I decided to go for broke and make the first upgrade on my virtual Linux system, pushing my Xubuntu install up to Saucy Salamander—aka Ubuntu 13.10—the underpinning of Xubuntu Linux.


To be very honest, my Linux upgrade happened behind the scenes with no intervention from me other than typing my administrative password and rebooting Linux. I am sure the Linux folks added a lot to the latest version and I have read the notes, but so far my undiscerning Linux eye hasn’t found anything which looks new. I mostly use the Firefox browser and Thunderbird email client on Linux. They both seem to work the same as they did before. LibreOffice has a few new features, including the ability to embed fonts in documents when sending them to someone else. It is a credit to the Linux folks that upgrading is now so painless. I am happy with my Linux world.


The Race Is On


My experiences upgrading to Windows 8.1 and OS X Mavericks were more interesting.


I had been forewarned that the download for Mavericks could be slow, so I started the download before I went to bed. The next morning when I came upstairs to my office, I found that I had a successful download and OS X Mavericks was ready to be installed. Being a little old school, I used DiskMaker X to make a bootable OS X Mavericks installer on an empty USB drive so I would not have to go through the download again in case there was a need.


Also, to make things more interesting, I queued up the Windows 8.1 download so I could start it at the same time OS X Mavericks started installing. Not that it really matters much, but it turned out Windows 8.1 downloaded and installed before OS X Mavericks finished installing.


Surprisingly, the upgrades went smoothly for both the Lenovo desktop and the Mac Mini. Of course we all know that the fun begins once you start trying to do the same things that were once easily accomplished using your old operating system.


Checking Out Mavericks


One of my least favorite parts of operating system upgrades is having to buy upgraded applications that are broken by the operating system upgrade. Usually there is at least one, and it was not surprising that VMware, my virtual machine client, was the one that broke. That meant things were starting not to work well in Linux, through no fault of Xubuntu's.


I checked and found that there was a “new and improved” VMware version that was designed to work with OS X Mavericks. I paid the $49.95 upgrade fee, downloaded and installed the new version of VMware, and my Xubuntu experience was back to normal.


I did the OS X Mavericks upgrade hoping that the new OS would fix a printing problem that developed with Mountain Lion. I have three printers on a network and one of them was showing as available on the Mac, but when I tried to print to it, it would never connect. The same printer worked fine from my Windows computer on the same network. I tried reinstalling it a couple of times but I never could get it to work.


I was pleasantly surprised when I tried to print to the printer under Mavericks and it actually worked. Unfortunately a couple of days later, it quit working so I finally gave up and hooked it to the Mac using USB while having the Windows machines access the printer through Ethernet.


So far I have only had one crash on my Mac running Mavericks. It was the old version of Pages and it has not happened again.


Windows On The World


How did the Windows system upgrade fare? Actually, things seemed to go very well until I tried to upload some photos using the built-in SD slot on my Lenovo tower. The SD slot did not work.


I rebooted and it worked, but the next time I tried it, it would not work again. I plugged in an external SD reader and it seems to work fine. It is actually a little easier to reach than the slot in the tower, so I may just ignore the problem.


I did have another somewhat scary problem after I upgraded to Windows 8.1: when I tried to wake the system from sleep the next day, I got the message that my system was broken and needed to be taken to a dealer. I rebooted, the message went away, and so far the problem has not reappeared. My fingers remain crossed.


So far on Windows, all my applications are working and Windows 8.1 is still the same multi-personality OS that it was before. I use Start8, so I mostly ignore the new Windows 8 interface on my desktop machine. I do use the touch features on my Lenovo Yoga which I have not upgraded to Windows 8.1.


The Biggest Changes


Of all the changes in the three operating systems, the one that most disrupted the way I work was the new default handling of second screens on the Mac. My Mac desktop has two screens, and after a few days I decided the new setting, which gives each screen its own Space, just would not work for me. I found the solution buried deep in the Mission Control preferences. There is a check box that lets me change back to the old way, where a single Mac window could stretch across two screens.


I haven't used the new iWork suite extensively, but with no support for linked text boxes, it is definitely not the same Pages. I am most impressed with iWork for iCloud. It seems to be a nice balance of speed and functionality. I even got it to work from a browser in Linux. I have tried opening a couple of RTF-format documents with Word and iWork; iWork looks like it might be speedier. I have been told the formats of the new iWork versions are not backward compatible with old versions, but you can export new documents to the old formats.


All in all, congratulations should go out to the folks who have brought us these modern operating systems. My triple roll of the upgrade dice was definitely made on a hot table.






from ReadWrite http://readwrite.com/2013/11/11/state-of-the-os-three-operating-systems-three-upgrades


Stability In An Uncertain World: Adding A Nine To Your Cloud Platform Availability


This guest post is from David Thompson, principal DevOps engineer at MuleSoft.



Nothing lasts forever. This is certainly true for infrastructure, and it's most poignantly obvious in the public cloud, where instances churn constantly. Your single-node MySQL service? Not long for this world, man. That god-like admin server where all your cron jobs and 'special tools' (hacks) live? It’s doomed, buddy, and it will take your whole application stack with it if you’re not careful.


One question that came up recently within the DevOps team here was: "Given the AWS EC2 service level agreement (SLA) of 99.95%, how do we maintain an uptime of 99.99% for our customer applications?" It's an interesting question, so let's explore a few of the principles we've learned from building a platform as a service that maintains higher availability than its IaaS provider.


Consider a simple-but-typical case, where you have three service components, each one having a 100% dependency on the next, so that it can’t run without it. It might look something like this:



You can calculate the expected availability of this system pretty easily, by taking the product of their individual availabilities. For instance, if each component is individually expected to hit three nines, then the expectation for the system is (.999 * .999 * .999) = .997, failing to meet a three-nine SLA.
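

As a quick sanity check, here is that arithmetic as a minimal Python sketch; the three-nines figures are just the hypothetical ones from the example above.

    # Availability of serially dependent services: every component must be up,
    # so the expected availability is the product of the individual figures.
    def serial_availability(availabilities):
        product = 1.0
        for a in availabilities:
            product *= a
        return product

    # Three components, each at three nines:
    print(serial_availability([0.999, 0.999, 0.999]))  # ~0.997, short of a three-nine SLA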


Redundancy and Clustering: Never Run One Of Anything


In order to break into the high-availability space, it’s critical to run production services in a redundant configuration; generally, you should aim for at least n+1 redundancy, where n is the number of nodes needed to handle peak load for the service. This is a simplistic heuristic, though, and in reality your ‘+1’ should be based on factors like the size of your cluster, load and usage patterns, and the time it takes to spin up new instances. Not allowing enough slack can lead to a cascade failure, where the load spike from one failure causes another, and so on until the service is completely inoperable.
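

To see roughly what the "+1" buys you, here is a back-of-the-envelope sketch. It assumes node failures are independent (real clusters only approximate this) and treats the cluster as up whenever at least n of its nodes are up; the 99.5% per-node figure is purely illustrative.

    from math import comb

    def cluster_availability(node_availability, n_required, n_total):
        # Probability that at least n_required of n_total independent nodes are up.
        p = node_availability
        return sum(
            comb(n_total, k) * p**k * (1 - p)**(n_total - k)
            for k in range(n_required, n_total + 1)
        )

    # Two nodes are needed to handle peak load, each at 99.5% availability:
    print(cluster_availability(0.995, 2, 2))  # ~0.990   -- no slack: worse than a single node
    print(cluster_availability(0.995, 2, 3))  # ~0.99993 -- n+1 buys roughly two extra nines

The same arithmetic also shows why the "+1" has to grow with cluster size, load patterns and instance spin-up time, as noted above.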



We typically run all of our edge (i.e., externally facing) services as stateless Web apps behind Elastic Load Balancers. This allows a lot of flexibility with regard to replacing instances, deploying hot fixes, and the other kinds of maintenance tasks that can get you into serious trouble when you're running a SaaS solution. The edge services are backed by a combination of persistence solutions, including Amazon RDS and MongoDB, each of which provides its own redundancy, replication and failover strategy. Instances for both API and persistence services are distributed across multiple EC2 Availability Zones (AZs), to help prevent a single AZ failure from taking out an entire service.


Loose Coupling And Service-Oriented Architecture


If you decouple the services so that each one is able to function without the others, your expected availability improves, but it also becomes a lot more complicated to calculate because you need to consider what a partial failure means in terms of your SLA. An architecture like this will probably look a little messier:



The diagram above shows a typical case where you have several different services, all running autonomously but consuming each other's APIs. Each of these blocks represents a load-balanced cluster of nodes, with the request-blocking calls in red and the asynchronous processing in black.


One example of a service-oriented architecture (SOA) that might be structured like this is an eCommerce system, where the synchronous request consumes billing and inventory services, and the asynchronous processing is handling fulfillment and notifications. By adding the queue in the middle, you can decouple the critical calls for the purchase process; this means that S2 and S4 can have an interruption, and the customer experiences no degradation of service.
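

Here is a minimal sketch of that decoupling, using Python's standard-library queue as a stand-in for a durable message broker (SQS, RabbitMQ or similar); the order-handling details are just placeholders.

    import queue
    import threading

    # Stand-in for a durable message broker; in production this would be SQS, RabbitMQ, etc.
    fulfillment_queue = queue.Queue()

    def purchase(order):
        # Synchronous, customer-facing path: only the critical calls (billing,
        # inventory) would block here. Fulfillment is handed off to the queue.
        fulfillment_queue.put(order)
        return "order accepted"

    def fulfillment_worker():
        # Asynchronous path: if this worker is down, orders simply wait in the
        # queue and the purchase path keeps working.
        while True:
            order = fulfillment_queue.get()
            print("fulfilling and notifying for", order)  # placeholder work
            fulfillment_queue.task_done()

    threading.Thread(target=fulfillment_worker, daemon=True).start()
    print(purchase({"sku": "widget", "qty": 1}))
    fulfillment_queue.join()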


Since we’re running a platform as a service (PaaS), we have different SLA requirements for customer apps versus our platform API services. Where possible, we run customer apps semi-autonomously, maintaining a loose dependency between them and the platform services so that if there is a platform service outage, it doesn’t impact the more stringent SLA for customer applications.


TDI: Test Driven Infrastructure


Monitoring is really just testing for infrastructure, and as with application code, thinking about testing from the beginning pays huge dividends in systems architecture. There are typically three major categories of monitoring required for each service architecture: infrastructure, application stack and availability. Each one serves its own purpose, and together they provide good visibility into the current and historical behavior of the services and their components.


For our public cloud infrastructure, we’re using a combination of Zabbix and Pingdom to satisfy these different monitoring needs. Both are configured to trigger alerts using PagerDuty, a SaaS alerting service that handles on-call schedules, contact information and escalation plans.


Zabbix is a flexible, open source monitoring platform for operating system and network level metrics. It operates on a push basis, streaming metrics to collectors that aggregate them and provide storage, visualization and alerting. Also—and critically in a public cloud environment—Zabbix supports automatic host registration so that a new node can register with the aggregator with no manual intervention.


Pingdom looks at services from the opposite perspective, i.e., as a list of black boxes that it checks periodically for certain behaviors. If you have carefully defined your SLA in terms of your APIs and their behaviors, then you can create a set of Pingdom checks that will tell you factually whether your service is meeting its SLA, and even create reports based on the historical trends.
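

In spirit, a black-box availability check is not much more than the following sketch, hand-rolled with the Python standard library rather than Pingdom's actual mechanics; the example URL is a placeholder.

    import time
    import urllib.request

    def check_once(url, timeout=5):
        # One probe: does the endpoint answer with a 2xx within the timeout?
        try:
            with urllib.request.urlopen(url, timeout=timeout) as resp:
                return 200 <= resp.status < 300
        except Exception:
            return False

    def observed_availability(url, probes=100, interval=1):
        # Probe periodically and report the success ratio, to compare against the SLA.
        ok = 0
        for _ in range(probes):
            if check_once(url):
                ok += 1
            time.sleep(interval)
        return ok / probes

    # e.g. compare observed_availability("https://api.example.com/health") against a 0.9999 target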



A PaaS also needs another layer of monitoring: internal platform monitoring. The platform checks the health of each running customer app on a periodic basis, and uses the AWS API to replace it automatically if something goes wrong. This makes it so that there is a minimal interruption of service even in the case of a catastrophic failure, because once the app stops responding it is soon restarted. Internal health checks like this are application specific and require a significant technical investment, but provide the best auto-healing and recovery capabilities because the application has the most context regarding expected behavior.
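

A simplified sketch of what such an internal health-check loop does follows; replace_worker here is a placeholder for whatever your platform does through the AWS API (terminate the bad instance, reschedule the app on a healthy container), not MuleSoft's actual implementation.

    import time
    import urllib.request

    def is_healthy(health_url, timeout=5):
        try:
            with urllib.request.urlopen(health_url, timeout=timeout) as resp:
                return resp.status == 200
        except Exception:
            return False

    def replace_worker(app_name):
        # Placeholder: a real platform would call the AWS API here to terminate the
        # failed instance and redeploy the customer app onto a healthy container.
        print("replacing worker for", app_name)

    def watch(apps, interval=30, max_failures=3):
        # apps: mapping of app name -> health-check URL
        failures = {name: 0 for name in apps}
        while True:
            for name, url in apps.items():
                if is_healthy(url):
                    failures[name] = 0
                else:
                    failures[name] += 1
                    if failures[name] >= max_failures:
                        replace_worker(name)
                        failures[name] = 0
            time.sleep(interval)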


Configuration As Code


It’s also critical to know what your hosts are running at all times, and to be able to spin up new ones or update existing ones at a moment’s notice. This is where configuration management comes in. Configuration management lets you treat configuration as code, committed to GitHub and managed just like any other repo.


For configuration management, the DevOps team at MuleSoft uses SaltStack, a lightweight remote execution tool and file server written in Python and based on ZeroMQ that provides configuration management as an intrinsic feature. Combined with AWS' CloudFormation service for infrastructure provisioning, this creates a potent tool set that can spin up, configure and run entire platform environments in minutes. SaltStack also provides an excellent remote execution capacity, handy under normal circumstances, but critically valuable when trying to make a sweeping production modification to recover from a degradation of service.


As an aside, the combination of IPython, boto and the Salt Python module provides an amazing interactive CLI for managing an entire AWS account from top to bottom. More about that in a future article.
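

As a very rough taste of what that interactive session can look like, here is a sketch assuming boto 2-era calls and a shell running on the Salt master; the region and minion targets are only examples.

    # Inside an IPython shell on an admin node that is also the Salt master.
    import boto.ec2
    import salt.client

    ec2 = boto.ec2.connect_to_region("us-east-1")
    instances = [i for r in ec2.get_all_instances() for i in r.instances]
    running = [i.id for i in instances if i.state == "running"]
    print(len(running), "instances running")

    local = salt.client.LocalClient()
    local.cmd("*", "test.ping")           # which minions are responding?
    local.cmd("web*", "state.highstate")  # converge the web tier to its declared state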


Low-Risk Deployment


It’s probably painfully obvious to anyone in the software industry, and especially to anyone in DevOps, that the biggest risks and the biggest rewards are always packaged together, and they always come with change. In order to maintain a high rate of change and high availability at the same time, it’s critical to have tools that protect you from the negative consequences of a botched operation. For instance, continuous integration helps to ensure that each build produces a functional, deployable artifact, multiple environments provide arenas for comprehensive testing, and red/black deployment takes most of the sting out of a failed deployment by allowing fast failure and rollback.


We use all of these strategies to deploy and maintain our cloud infrastructure, but the most critical is probably the automated red/black deployment behavior incorporated into the PaaS customer app deployment logic, which deploys a new customer app to the environment, and only load balances over to it and shuts down the old application if the new one passes a health check. When DevOps needs to migrate customer apps off failing infrastructure or out of a degraded AZ, we leverage the same functionality to seamlessly redeploy them to healthy containers.
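

Stripped of platform specifics, the red/black logic reduces to something like the sketch below; every helper is a placeholder standing in for the PaaS internals, not the real implementation.

    # Placeholder helpers standing in for the PaaS internals.
    def deploy(app, version):                  return [f"{app}-{version}-node-{i}" for i in range(2)]
    def passes_health_check(node):             return True  # in reality: hit the app's health endpoint
    def load_balancer_members(app):            return [f"{app}-old-node-{i}" for i in range(2)]
    def add_to_load_balancer(app, nodes):      print("LB now includes", nodes)
    def remove_from_load_balancer(app, nodes): print("LB no longer includes", nodes)
    def tear_down(nodes):                      print("terminating", nodes)

    def red_black_deploy(app, new_version):
        new_nodes = deploy(app, new_version)          # bring up the new version alongside the old
        if not all(passes_health_check(n) for n in new_nodes):
            tear_down(new_nodes)                      # fast failure: the old version never stopped serving
            raise RuntimeError("new version failed its health check; rolled back")
        old_nodes = load_balancer_members(app)
        add_to_load_balancer(app, new_nodes)          # start routing traffic to the new version
        remove_from_load_balancer(app, old_nodes)
        tear_down(old_nodes)                          # only now shut down the old version

    red_black_deploy("storefront", "v42")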


Availability For All


There really is no black magic required in order to set up a redundant, resilient and highly available architecture in AWS (or your public cloud provider of choice). As you can see from our setup, the accessibility of advanced IaaS platform services and high quality open source and SaaS tools allows any organization to create an automated and dependable tool chain that can manage entire infrastructure ecosystems in a reliable fashion.






from ReadWrite http://readwrite.com/2013/11/08/stability-in-an-uncertain-world-adding-a-nine-to-your-cloud-platform-availability
