Tags: big data, infographic, technology
Kirsten Newbold-Knipp, Gartner: Here are a few highlights from some of our 2016 marketing cool vendors reports as well as guidance on technology selection. Cool Vendors in Content Marketing: As content marketing grows up from its early tactical success to become a scalable program, marketers need to expand their content pipeline with high quality results. […]
Interesting visual on this topic.
Ramblings on Software Delivery February 28, 2016. Posted by Edwin Ritter in Cloud Computing, Project Management.
Tags: release management, release management process, software delivery
I expect there will be a time in the not-too-distant future when software development will be easier. To be clear, I mean predictable, even commonplace.
While not fully equivalent, here is a highly simplified example. The electrical socket outlet is a common fixture in commercial and residential buildings. This did not happen by accident. The generation of electrical power, and the demand that followed, drove a great deal of standardization, regulation and common design elements. These were required and fundamental to the broad adoption and ubiquitous availability of electricity.
Software best practices constantly evolve, and there have been lots of advances since the 1GL, 2GL and 3GL days. Yet, despite those improvements, a recurring challenge on projects for me concerns the final product. By that, I mean delivery and all that goes with it. In no particular order, this includes customer acceptance, integration testing, launch, cut-over and final QA. From my experience, what should be common and predictable is instead anything but. Each project has unique challenges, but the overall release process is the same.
And yet, here we are with 4GL and 5GL programming while fundamental challenges remain to be solved. Code objects are interoperable; APIs are well understood but ad hoc. It seems to me that there should be more common functions and libraries that provide a lot of the basics. And there are, to a point. But not at the same level as the electrical outlet.
Perhaps the challenge is more with interfaces and data types (schemas). But I digress. Delivery and integration of software remains a crucial phase (see delivery above). Imagine a time when delivery is the same as generating power and connecting to the ‘grid’. Delivery is then driven by demand and usage among other things. Yes, cloud computing comes to mind and that has made things easier.
Which leads me to release planning using Agile practices, a good step toward common delivery. A more consistent release process is always preferable to an ad hoc one. I have earlier posts on this topic and expect there may be future ones as well. It is an evolving practice and gets better all the time.
ITIL outlines release management best practices. I believe the state of the art in software development will eventually reach the electrical power example I mentioned earlier. The general consensus is that we have achieved a basic process for delivery. Understanding the current state, releasing at defined and consistent intervals, using standards and automating wherever possible are all part of best practice.
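To make the automation point concrete, here is a purely illustrative sketch of release gates expressed as code. Every gate name and state key below is invented for the example; this is not a real tool's API, just one way of encoding "know the current state, check it consistently, automate the decision":

```python
# Illustrative release-gate checker. Gate names and state keys are
# hypothetical examples, not drawn from any real release tool.

RELEASE_GATES = [
    ("final QA passed", lambda state: state["qa_passed"]),
    ("integration tests green", lambda state: state["integration_green"]),
    ("customer acceptance signed off", lambda state: state["acceptance_signed"]),
]

def ready_to_release(state):
    """Return (ok, failures): ok is True only if every gate passes."""
    failures = [name for name, check in RELEASE_GATES if not check(state)]
    return (not failures, failures)

if __name__ == "__main__":
    current = {"qa_passed": True, "integration_green": True,
               "acceptance_signed": False}
    ok, failures = ready_to_release(current)
    print("release approved" if ok else f"blocked by: {failures}")
```

The point of the sketch is that once the gates are explicit and machine-checkable, the release decision becomes the same predictable step on every project.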
What other challenges will remain going forward? How will standards be revised and improved to reach the electrical power equivalent?
Cloud Life in 2015 February 27, 2015. Posted by Edwin Ritter in Cloud Computing.
Tags: cloud, cloud computing
Are we there yet? Everyone seems to be talking about cloud services these days, but not everyone has migrated. Once you and your team have configured the repositories, gotten used to the tools and established new or revised processes, life is good. The migration itself can vary from easy to arduous. One thing that seems to get glossed over is moving things from here to there: project assets (images, videos, content – you know, stuff), documentation, spreadsheets. I know, I know – it’s very easy. Drag and drop. It is easy, to a point.
While there is a lot of talk about cloud services, it seems to me we are in the early stages of life in the clouds. We are still using desktop-based tools, not cloud-based widgets. I imagine a day when repositories will be set up for groups of files and directories using objects, with content managed via metadata and aligned to a specific taxonomy. What’s that? It exists already? Yes, it does in certain places, but it is not ubiquitous and not homogeneous (yet). Too often, we are dealing with unique files, not a larger data set.
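As a toy sketch of what metadata-driven content management could look like, here is one way to treat assets as a queryable data set rather than individual files. The asset names, kinds and tags are all invented for illustration:

```python
# Hypothetical sketch: tag each asset against a small taxonomy, then
# query the collection as a data set instead of walking directories.

from dataclasses import dataclass, field

@dataclass
class Asset:
    path: str
    kind: str                      # e.g. "image", "video", "doc"
    tags: set = field(default_factory=set)

def find(assets, kind=None, tag=None):
    """Query the repository by taxonomy instead of by file path."""
    return [a for a in assets
            if (kind is None or a.kind == kind)
            and (tag is None or tag in a.tags)]

repo = [
    Asset("logo.png", "image", {"branding"}),
    Asset("launch.mp4", "video", {"marketing", "2015"}),
    Asset("budget.xlsx", "doc", {"finance", "2015"}),
]

print([a.path for a in find(repo, tag="2015")])
```

Migrating a repository then becomes a query over metadata ("everything tagged 2015") rather than a drag-and-drop of individual files.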
In the interim, it’s about the journey, not the destination, and keeping the business running while you migrate. Tools change all the time, and there will be a day when migrating, updating or changing repositories is done at a higher level than it is today.
And, hey, you – get off of my cloud.
Virtual Work January 30, 2015. Posted by Edwin Ritter in career, Cloud Computing.
Tags: life-work balance, remote work, virtual office, virtual work
While in the office, how many times have you heard someone ask the old saw ‘Are you working hard or hardly working?’ It is funny when used at the right moment and a proven gambit to spark friendly banter among co-workers. In the digital age, a variation on the phrase that reflects current work practices could be ‘Are you working virtual or virtually working?’
The difference is more subtle than you might think. Working virtual can mean working remotely. That is, you may be out of the office for a short time but will be back physically. The expectation of management is that while working remotely (wherever that may be), you will work on the same tasks as when in the office. Virtual work is conceptually similar but different in that you are never physically in an office with co-workers. No office cubicle, no ad hoc hallway or water cooler conversations, no in-person meetings or face time.
There are definite advantages to virtual jobs. Top of mind advantages include:
- life-work balance,
- minimal commute time and
- flexible work hours.
Having a virtual job is not for everyone and is certainly not possible with all jobs. Virtual work takes different habits. Making the transition to working in a virtual office also requires a change in mindset. Being comfortable with lots of alone time is part of the transition. Interaction with co-workers is reduced, and it takes a bit more effort (and time) to pose a question or have an ad hoc conversation via chat or email.
In response to the question, I am virtually working and enjoy what I am doing.
Ramblings on the Programmable World June 6, 2013. Posted by Edwin Ritter in Cloud Computing, Trends.
Tags: big data, connections, data mining, data usage, network
I am an active reader of Wired and enjoy articles that deal with emerging trends. In a recent issue, I found a write-up that relates to big data. In the near future, data will be generated by everyday items. The main concept of the piece deals with connecting our analog devices, such as a refrigerator (and everything else), to a network. The idea is not as far-fetched as it used to be. This is not limited to active devices, either. Passive devices, such as doors and windows, can be connected as well. They will be talking to us and to each other. All of this is the early stage of a programmable world, and it will take some time to sort out.
We know from Moore’s Law that electronic devices get cheaper all the time. Here is a direct link to the article in Wired that describes how we will connect sensors to physical devices and integrate them into the programmable world.
According to the author, there are three stages. The first involves getting devices on the network. Think about when every device in your house is connected to your Wi-Fi network; a system to collect and arrange the data is required. The prospect of generating and monitoring this data, then triggering events as a result, will lead to applications not currently within our purview. And this is not just for residential use; factory automation can be taken to a new level, bringing the analog world into the digital age. The second stage will have those devices on the network sync with each other – output from one device triggers an action in another. The third and final stage involves using these devices as a system, or single platform. To make this work, we will need repeatable and consistent patterns. The first generation will be crude and will not handle exceptions well. We will get smarter about that and iterate on stimulus and response triggers.
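The second stage, where one device's output triggers an action in another, can be sketched as a tiny rule engine. The devices ("window", "thermostat") and the rule below are invented for illustration, not from the article:

```python
# Toy sketch of stage two: sensor readings are published to a hub,
# and rules map one device's output to another device's action.

class Hub:
    """Minimal event hub: devices publish readings, rules react."""
    def __init__(self):
        self.rules = []        # list of (predicate, action) pairs
        self.log = []          # actions triggered so far

    def when(self, predicate, action):
        self.rules.append((predicate, action))

    def publish(self, device, reading):
        event = {"device": device, "reading": reading}
        for predicate, action in self.rules:
            if predicate(event):
                self.log.append(action(event))

hub = Hub()
# Hypothetical rule: if the window sensor reports "open",
# tell the thermostat to pause heating.
hub.when(lambda e: e["device"] == "window" and e["reading"] == "open",
         lambda e: "thermostat: pause heating")

hub.publish("window", "open")
print(hub.log)
```

The crude first generation the author predicts is visible even here: the rule fires on a simple match with no exception handling, and getting smarter means iterating on exactly these stimulus and response pairs.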
We have already connected discrete sets of devices into disparate networks. Wireless (aka Wi-Fi), Bluetooth, Radio Frequency Identification (RFID) and Near Field Communication (NFC) are routinely used in security badges, printers, cameras, smartphones and tablets. The next iteration will be connecting them together.
There is great potential here, with the number of devices to connect running into the trillions. A big number. Each device generates data based on stimulus. Managing it all will require programming – lots of programming – and networking to make everything work together. Big companies including Qualcomm, Cisco, GE and IBM are looking at this, and start-ups are working on it as well. Changes will be seen in the home, the factory and the office.
To revise an old phrase, this isn’t your father’s network. Devices with embedded sensors will generate a lot of chatter. Who is going to listen? Where will that data be stored? What standards will be needed for command and control? We will get all that sorted out and then look for the next challenge.
This is why big data is the sweet spot for SaaS May 15, 2013. Posted by Edwin Ritter in Cloud Computing, Trends.
Tags: big data, cloud, cloud computing, data mining, metrics
Big data’s sweet spot is in software as a service (SaaS).
People often ask me where the smart money is in big data. I often tell them that’s a foolish question, because I’m not an investor — but if I were, I’d look to software as a service.
There are two primary reasons why, the first of which is obvious: Companies are tired of managing applications and infrastructure, so something that optimizes a common task using techniques they don’t know on servers they don’t have to manage is probably compelling. It’s called cloud computing.
The other reason is that the big part of big data really is important if you want to get a really clear picture of what’s happening in any given space. While no single end-user company can (or likely would) address search-engine optimization, for example, by building a massive store comprised of data from hundreds or thousands of companies as well as the entire web, a cloud service…
View original post 1,032 more words
Ramblings on robots February 3, 2013. Posted by Edwin Ritter in Cloud Computing, Project Management.
Tags: career, jobs, robot, robots, technology
The topic of robots and automating work via machines has gotten a bit of ink lately. In the last month, I have read several articles on how robots will replace humans. Perhaps you’ve seen them also. The premise, or promise, is that machines will take over many of the tasks currently done by humans. A recent issue of Wired has robots as its cover story, titled Better Than Human. The question is not if, but when, robots will replace people in many of the jobs that exist today. The major assumption is that this will create new jobs for us carbon-based life forms. The impact of using machines instead of people to perform a task is also connected to big data and cloud computing.
The concept is not new, of course. It can be argued that machine automation started with the Industrial Revolution, as machines performed what humans did previously. Benefits of using machines include consistent, repeatable actions, improved forecasting of turns (i.e., throughput), working with known capacity, higher quality, less waste and more accurate delivery. Having machines in place frees humans to focus on other aspects of running a business.
From an economic and budget perspective, we know that the human element is the highest cost in any process. As long as Moore’s Law holds, using machines makes more and more budgetary sense. This type of disruptive change will initially bring uncertainty, fear and confusion. At least, to us humans. To the machines it would be a non-event; they might just say “meh”.
Any speculation I have at this point would be just that: speculation on how this will play out. However, I do look forward to the new jobs robots will create. Having a bot take over what I do now would be great. When that happens, I will be able to define a process or sequence of operations for one or more bots, aligning those resources to perform the work I assign to them. No feedback, no personal issues, no drama, just predictable results. I won’t have to schedule meetings, take and distribute notes, or ask them for a critique of my performance, either. Hmm, this could be a really good thing. My future job description may include more think time to define and improve innovation.
When will this happen again? When it does, will you be ready?
Ramblings on the Personal Cloud November 18, 2012. Posted by Edwin Ritter in Cloud Computing, Trends.
Tags: cloud computing, computing, dropbox, google docs, personal cloud, SmartPhone, tablet, tablets
Changing of the terms – life in the cloud will redefine many things as we go forward. One technical term that will change is PC. We all know that PC initially meant personal computer. That is the accepted 1.0 definition. In the 2.0 version, it stands for personal cloud. This is the third and final post on the technology trends I am watching this year. Previous posts covered big data and cloud computing.
The modern internet provides many cloud (or web-hosted) services that are easily personalized to satisfy our needs. One example is Google Docs. Storing documents in the cloud makes them accessible from multiple devices. Another cloud-based service is Dropbox. These, and the many others like them, store information in the cloud as opposed to a local hard drive or internal network folder. Cloud services make it very easy to share information – send the URL. An added benefit is that sharing a URL does not clog up your email in-box with lots of attachments. Many reliable cloud services are free, and the subscription fee-based services provide additional capability.
I like life in the cloud and have had my personal cloud for some time now. The convenience of accessing a file from multiple devices is a wonderful thing. Sharing data with family and colleagues is easy and quick – simply provide the URL.
I submit that the personal cloud has a major impact for owners of tablets and smartphones. Where a home or work computer can store information locally on a hard drive, tablets and smartphones have limited storage space. Using the cloud for storage makes this a non-issue.
How is your personal cloud? Do you use these services without thinking about it? Will it change where you store/share information?
Ruminations on processing in the cloud October 14, 2012. Posted by Edwin Ritter in Cloud Computing.
Tags: cloud computing, mobile, SmartPhone
Among the technology trends this year, I am focused on three in particular. I have previously covered big data. In an upcoming post, I will look at the personal cloud. This post talks about mobile devices and the BYOD phenomenon; the term stands for “Bring Your Own Device”. At work, we increasingly use our smartphones to access applications, check email and stay in sync. The big assumption here, of course, is that your IT department can support this. Any time, any place – tracking calendar updates for meetings, checking email and updating wikis can all be done easily from your smartphone. For road warriors, using an existing smartphone is now routine. Office workers are realizing the advantages of being able to access data outside the office as well.
This infographic shows a few ways this trend is being used. There are real advantages to BYOD – here are a few.
A reduction in hardware cost is one. Having employees upgrade to the latest smartphone, tablet, laptop, etc. eliminates a lot of cost for the organization. The devices are omnipresent; the data is constantly available. You are engaged and in touch more often.
Policy changes are part of this phenomenon. IT will typically require secure network access. Once you connect to the internal network, you are on your own. Remember your credentials and keep them secure.
After that, enjoy and keep your device charger handy and let the cloud process for you.
Using Big Data to save lives June 11, 2012. Posted by Edwin Ritter in Cloud Computing, Trends.
Tags: big data, cloud computing, storms, weather
Here’s one way to put big data to good use: saving lives. Collecting and managing all that information, and making real-time decisions to assess risk profiles for imminent danger, is exactly why we use technology. What other ways have you seen big data put to use?
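As a toy illustration of what "combining data sources into a risk profile" can mean, here is a hypothetical scoring function. The inputs, weights and thresholds are all invented; a real tool, like the one described in the excerpt, would derive them from historical and meteorological data:

```python
# Hypothetical risk-profile sketch: blend a few hazard inputs into a
# single 0..1 score. Weights (0.6 surge, 0.4 wind) are invented.

def risk_score(surge_ft, wind_mph, elevation_ft):
    """Combine storm surge, wind speed and elevation into one score."""
    # Surge only matters above the property's elevation.
    surge_risk = max(0.0, min(1.0, (surge_ft - elevation_ft) / 10.0))
    # Normalize wind against a nominal 150 mph ceiling.
    wind_risk = min(1.0, wind_mph / 150.0)
    return round(0.6 * surge_risk + 0.4 * wind_risk, 2)

print(risk_score(surge_ft=12, wind_mph=120, elevation_ft=4))
```

Even this crude blend shows the idea: a single number per address, computed in real time from several feeds, is far more actionable than raw data when deciding whether to evacuate.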
Rice University researchers have built a web-based calculator that predicts the risks associated with hurricanes for a specific address in Houston. The tool uses historical and meteorological data to generate a risk profile for residents of the city in real-time (hat tip Discovery News). As a former Houston resident who has lived through several hurricanes, this is a pretty nifty combination of a variety of data sources into a tool that helps regular people make decisions.
The tool, which is limited to Harris County, was inspired by the mass evacuations that occurred during Hurricane Rita in 2005. Millions of Houstonians fled the storm and blocked major roadways. Not all of those who left needed to, but absent hard data it’s hard to know what to do if you’re reading about a Category 5 storm heading your way.
View original post 156 more words