
Gartner Marketing Technology Map May 2016 — What’s The Big Data? May 15, 2016

Posted by Edwin Ritter in Cloud Computing, Trends.

Kirsten Newbold-Knipp, Gartner: Here are a few highlights from some of our 2016 marketing cool vendors reports as well as guidance on technology selection. Cool Vendors in Content Marketing: As content marketing grows up from its early tactical success to become a scalable program, marketers need to expand their content pipeline with high quality results. […]

via Gartner Marketing Technology Map May 2016 — What’s The Big Data?

Interesting visual on this topic.

Ramblings on the Programmable World June 6, 2013

Posted by Edwin Ritter in Cloud Computing, Trends.

I am an active reader of Wired and enjoy articles that deal with emerging trends. In a recent issue, I found a write-up that relates to big data. In the near future, data will be generated by everyday items. The main concept of the piece deals with connecting our analog devices, such as a refrigerator (and everything else), to a network. The idea is not as far-fetched as it used to be. Nor is it limited to active devices; passive ones, such as doors and windows, can be connected as well. They will be talking to us and to each other. All of this is the early stage of a programmable world, and it will take some time to sort out.

We know from Moore’s Law that electronic devices get cheaper all the time. Here is a direct link to the article in Wired that describes how we will be connecting sensors to physical devices and integrating them into the programmable world.

Sensors and objects linked together.

According to the author, there are three stages. The first involves getting devices on the network. Think about when every device in your house is connected to your Wi-Fi network; a system to collect and organize all that data will be required. The prospect of generating and monitoring this data, then triggering events as a result, will lead to applications not within our purview currently. And this is not just for residential use; factory automation can be taken to a new level, bringing the analog world into the digital age. The second stage will have those devices on the network sync with each other – output from one device triggers an action in another. The third and final stage involves using these devices as a system, or single platform. To make this work, we will need repeatable and consistent patterns. The first generation will be crude and will not handle exceptions well. We will get smarter about that and iterate on stimulus and response triggers.
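The second stage described above – one device's output triggering an action in another – is essentially a publish/subscribe pattern. Here is a minimal sketch of that idea; the device names, events and reaction are hypothetical examples, not from the article:

```python
# Illustrative sketch of stage two of the programmable world:
# output from one device triggers an action in another.
# Device names and the reaction below are made-up examples.

class Device:
    def __init__(self, name):
        self.name = name
        self.listeners = []

    def subscribe(self, callback):
        # Register another device's handler for this device's events.
        self.listeners.append(callback)

    def emit(self, event):
        # Publish a sensor event to every subscribed device.
        for callback in self.listeners:
            callback(event)

actions = []

window_sensor = Device("window")

def thermostat_handler(event):
    # The thermostat reacts to the window opening.
    if event == "opened":
        actions.append("thermostat: heating paused")

window_sensor.subscribe(thermostat_handler)
window_sensor.emit("opened")

print(actions[0])  # thermostat: heating paused
```

A first-generation version like this is exactly the "crude" stimulus-and-response wiring the post anticipates; the hard part is making such triggers repeatable and exception-safe at scale.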

We have already experienced discrete sets of connected devices on disparate networks. Wireless (aka Wi-Fi), Bluetooth, Radio Frequency Identification (RFID) and Near Field Communication (NFC) are routinely used in security badges, printers, cameras, smartphones and tablets. The next iteration will be connecting them all together.

There is great potential here, with the number of devices to connect in the trillions – a big number – and each device generating data based on stimulus. Management will require programming – lots of programming – and networking to make this all work together. Big companies including Qualcomm, Cisco, GE and IBM are looking at this, and startups are working on it as well. Changes will be seen in the home, the factory and the office.

To revise an old phrase, this isn’t your father’s network. Devices with embedded sensors will generate a lot of chatter. Who is going to listen? Where will that data be stored? What standards will be needed for command and control? We will get all that sorted out and then look for the next challenge.

This is why big data is the sweet spot for SaaS May 15, 2013

Posted by Edwin Ritter in Cloud Computing, Trends.

The big data sweet spot is in software as a service (SaaS).

Gigaom

People often ask me where the smart money is in big data. I often tell them that’s a foolish question, because I’m not an investor — but if I were, I’d look to software as a service.

There are two primary reasons why, the first of which is obvious: Companies are tired of managing applications and infrastructure, so something that optimizes a common task using techniques they don’t know on servers they don’t have to manage is probably compelling. It’s called cloud computing.

The other reason is that the big part of big data really is important if you want to get a really clear picture of what’s happening in any given space. While no single end-user company can (or likely would) address search-engine optimization, for example, by building a massive store comprised of data from hundreds or thousands of companies as well as the entire web, a cloud service…

View original post 1,032 more words

Using Big Data to save lives June 11, 2012

Posted by Edwin Ritter in Cloud Computing, Trends.

Here’s one way to put big data to good use – saving lives. Collecting all that information, managing it and making decisions in real time to assess risk profiles for imminent danger is exactly why we use technology. What other ways have you seen big data put to use?

Gigaom

Rice University researchers have built a web-based calculator that predicts the risks associated with hurricanes for a specific address in Houston. The tool uses historical and meteorological data to generate a risk profile for residents of the city in real-time (hat tip Discovery News). As a former Houston resident who has lived through several hurricanes, this is a pretty nifty combination of a variety of data sources into a tool that helps regular people make decisions.

The tool, which is limited to Harris County, was inspired by the mass evacuations that occurred during Hurricane Rita in 2005. Millions of Houstonians fled the storm and blocked major roadways. Not all of those who left needed to, but absent hard data it’s hard to know what to do if you’re reading about a Category 5 storm heading your way.

For those who stayed, understanding their risk of power failure or wind…

View original post 156 more words

10 ways big data changes everything May 9, 2012

Posted by Edwin Ritter in Cloud Computing, E-Commerce, SEO, Trends.

As 2012 reaches the halfway mark, here is a post on one of this year’s hot topics. This is the first of three.

What is big data? How big is big? Think yottabytes. So much data is now collected that 90% of online data was created in just the last two years. Simply stated, everything you do on the web is tracked and creates data, which is then stored, sliced, diced and analyzed. The growth in data is due to the proliferation of smartphones and tablets, lower storage costs and improved analytical tools. This article reveals 10 ways in which big data will have an impact.
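The unit arithmetic behind "think yottabytes" is easy to sanity-check. Using decimal (SI) prefixes, giga- is 10^9 and yotta- is 10^24, so a yottabyte works out to one quadrillion (10^15) gigabytes:

```python
# Quick unit check: how many gigabytes are in a yottabyte?
# Decimal (SI) prefixes: giga = 10^9 bytes, yotta = 10^24 bytes.
GIGABYTE = 10**9
YOTTABYTE = 10**24

gb_per_yb = YOTTABYTE // GIGABYTE
print(gb_per_yb)  # 1000000000000000, i.e. one quadrillion
```

(Using binary prefixes, a yobibyte is 2^80 bytes, slightly larger, but the quadrillion-gigabytes figure quoted below uses the decimal definition.)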

Gigaom

A yottabyte isn’t what happens when the Jedi master starts gnawing on your leg. It’s the information equivalent of one quadrillion gigabytes, and is enough digital data to fill the states of Delaware and Rhode Island with a million data centers, according to Backblaze. While the world hasn’t yet seen many yottabytes, industries like Internet search, genomics, climate research, and business analytics are starting to create massive data sets — in the peta- and exabyte range — that are requiring an entirely new set of big data tools to manage.

The emergence of this so-called big data phenomenon is also fundamentally changing everything from the way companies operate, to the way people interact, to how the world deals with outbreaks of infectious diseases. On March 21st and 22nd, GigaOM is throwing an event about the future of this big data ecosystem in New York, Structure:Data, and for…

View original post 9,088 more words