Friday, June 20, 2014

THE TOP 10 MISTAKES COMPANIES MAKE WHEN DEPLOYING A DOCUMENT SCANNING SOLUTION

Recent studies have shown that document scanning remains one of the top technology priorities of organizations, regardless of their size or their vertical market. Little wonder: when properly deployed, document scanning solutions deliver a tremendous payback – in terms of cost savings, faster turnaround, better quality, streamlined compliance and more. But mistakes during system implementation can undermine even the strongest business case for a document scanning solution. To help keep your document scanning project on the right track, here are the 10 most common mistakes that organizations make when deploying the technology – and how to avoid them.

1. Not buying enough scanning capacity
Too often, organizations use their average document processing volumes when determining scanning capacity. But organizations must also consider variables such as peak daily volume, required or contracted deadlines for completing work, and the effective throughput (not the advertised speeds) of the scanners that they are considering buying.
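To make the arithmetic concrete, here is a minimal sizing sketch in Python; every figure in it (peak volume, processing window, rated speed, utilization factor) is a hypothetical illustration, not a vendor specification.

import math

# Rough scanner-capacity sizing sketch. All figures are hypothetical.
PEAK_DAILY_PAGES = 120_000     # size on peak volume, not the average
PROCESSING_WINDOW_HOURS = 6    # time available to meet the deadline
RATED_PPM = 120                # advertised pages per minute
EFFECTIVE_FACTOR = 0.6         # realistic utilization after jams,
                               # rescans, batch changeovers and breaks

effective_pph = RATED_PPM * 60 * EFFECTIVE_FACTOR            # pages/hour
pages_per_scanner = effective_pph * PROCESSING_WINDOW_HOURS  # per window
scanners_needed = math.ceil(PEAK_DAILY_PAGES / pages_per_scanner)

print(f"Effective throughput: {effective_pph:,.0f} pages/hour")
print(f"Scanners needed on a peak day: {scanners_needed}")

Note how much the utilization factor matters: sizing on the advertised speed alone would suggest three scanners for this hypothetical workload, while the effective throughput calls for five.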

2. Not including all stakeholders, both business and IT, in the requirements definition
In some cases, the IT department will unilaterally choose the organization’s scanners, saddling operations with devices that don’t meet their requirements or are difficult for operators to use. In other cases, an operations team will select scanners without the IT department’s involvement, only to discover that the organization’s legacy systems and/or infrastructure can’t support them.

3. Buying a solution without conducting a proof of concept
Organizations should never purchase a scanner without first seeing how it processes their documents. Too many organizations buy document scanning technology based on what they read in a brochure or see at a trade show. They need to test whether the scanner fits their business requirements and processing environment. Organizations should also have some of their own operators run the scanner to test its usability.

4. Making decisions on front-end and back-end systems separately
An organization’s front-end scanning and capture solution must work in concert with its back-end workflow technology. For instance, organizations must ensure that their document scanning and capture solution can output images and data in the formats required by their back-end systems, whether flat files, XML files, Excel spreadsheets or database output. In one case, a BPO purchased a scanning and capture solution that could only output images and data in a single format, and ended up spending considerable time and money reconfiguring the output to the various formats that its customers required. Organizations also need to ensure that their back-end systems are fast enough to keep up with their front-end solutions; otherwise, they will experience bottlenecks in the “hand-off” of images and data. We’ve seen delays of as much as 35 to 40 percent between front-end and back-end systems.
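As a hedged illustration of that output flexibility, the following Python sketch exports one captured document’s metadata in three common hand-off formats. The field names and file names are assumptions invented for the example, not any particular product’s schema.

# Minimal sketch: exporting one captured document's metadata in the
# formats different back-end systems might require. Field names and
# file names are hypothetical illustrations.
import csv
import json
import xml.etree.ElementTree as ET

record = {"batch_id": "B-1001", "doc_type": "invoice",
          "amount": "249.00", "image_file": "B-1001_0001.tif"}

# Flat file (CSV) for a legacy line-of-business system
with open("export.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=record.keys())
    writer.writeheader()
    writer.writerow(record)

# XML for a workflow system that expects structured input
doc = ET.Element("document")
for key, value in record.items():
    ET.SubElement(doc, key).text = value
ET.ElementTree(doc).write("export.xml", encoding="utf-8")

# JSON for a database or API-based hand-off
with open("export.json", "w") as f:
    json.dump(record, f, indent=2)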

5. Not coordinating software and hardware vendors during system deployment
No one wins in this scenario. A lack of coordination typically results in wasted effort, finger-pointing and delayed implementations. We have seen many cases where front-end and back-end solution providers get their systems up and running at a customer site, but there is no integration, because the vendors and the customer never discussed critical issues such as what data needs to be passed from one system to another, the image formats required by the back-end systems, and how data should be routed.
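One lightweight remedy is to write the hand-off down before deployment. The sketch below expresses such an integration contract as a simple Python data structure; every value in it is a hypothetical example of the kind of detail the vendors and the customer should agree on, not a recommendation of specific settings.

# Hypothetical sketch of a front-end/back-end hand-off contract that
# all parties review and sign off on before go-live. All values are
# illustrative assumptions.
HANDOFF_CONTRACT = {
    "image_format": "TIFF, Group 4, 300 dpi",   # what the back end expects
    "data_format": "XML, one file per batch",
    "required_fields": ["batch_id", "doc_type", "page_count"],
    "routing": "deliver to the agreed inbox share, polled every 60 s",
    "error_handling": "reject the whole batch if a required field is missing",
}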

6. Not using a phased implementation approach
In their drive for fast results, too many organizations bite off more than they can chew when implementing a scanning solution. Trying to deploy an entire system at once can overwhelm internal resources and draw out the deployment, putting the entire project at risk of being shut down. Instead, organizations should determine where they can have the biggest impact on their operations with the least amount of change; they shouldn’t disrupt a process that isn’t broken. With an initial success under their belts, users should then prioritize the next phases of their implementation based on their potential benefits.

7. Letting fear of change take over
Too many organizations are closed-minded when it comes to re-engineering their processes, falling back on the way they’ve done things for the past five or 10 years. For instance, some organizations manually count every document that they scan and write the number on the first page of each batch. This process was necessitated by older technology that was prone to double-feeds or lacked automatic document counters. There is no need to do this with today’s scanning technology, and continuing to do so creates needless, not to mention costly, work. The best strategy for helping your staff overcome their fear of change is to let them see the technology run firsthand. Once they see that the scanner detects double-feeds and counts documents, for example, they’ll recognize the impact it will have on document preparation.

8. Not thinking LEAN
Organizations should always be looking for ways to do more with less. For instance, organizations shouldn’t automatically purchase more of their legacy scanners as their volume grows; there may be other scanners available that enable them to consolidate hardware. Similarly, most organizations can do a more efficient job of document preparation; there’s no need to tape small documents to 8 ½ x 11-inch paper, or to use multiple separator sheets for scanning.

9. Not cutting the paper cord
Many organizations use unique transaction separator sheets for each type of work that they process, creating an enormous breadth and volume of paper. Today’s document scanning solutions give organizations an opportunity to rid themselves of this paper by automatically separating transactions based on documents (e.g. checks or envelopes) within a batch. The technology also allows organizations to insert generic separator sheets that can be re-used; one company has re-used its generic separator sheets for the past five years, saving significant money.
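The separation logic itself is simple, as the following Python sketch shows. It assumes the capture software has already classified each page (for example, by recognizing a barcode or patch code on the generic separator sheet); the string labels here stand in for that classification.

# Minimal sketch of rule-based transaction separation: split a scanned
# batch into transactions wherever a reusable generic separator page
# appears. Page labels are hypothetical stand-ins for the capture
# software's page classification.

batch = ["separator", "check", "remittance", "separator",
         "check", "check", "remittance"]

def split_transactions(pages):
    """Group pages into transactions, dropping the separator pages."""
    transactions, current = [], []
    for page in pages:
        if page == "separator":
            if current:
                transactions.append(current)
            current = []
        else:
            current.append(page)
    if current:
        transactions.append(current)
    return transactions

print(split_transactions(batch))
# [['check', 'remittance'], ['check', 'check', 'remittance']]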

10. Not sharing, as in shared services
With the economy still struggling, and capital budgets tight, organizations should look to consolidate multiple scanning functions on a single platform.

Properly deployed, document scanning solutions deliver tremendous results. But mistakes during implementation can undermine even the strongest business case for the technology. Avoiding the 10 mistakes described above will help ensure the success of your organization’s scanning project.

Excerpt from www.ibml.com

Monday, June 9, 2014

THE TOP TEN STRATEGIC TECHNOLOGY TRENDS FOR 2014. Vol IV

9. Smart Machines
Through 2020, the smart machine era will blossom with a proliferation of contextually aware, intelligent personal assistants, smart advisors (such as IBM Watson), advanced global industrial systems and public availability of early examples of autonomous vehicles. The smart machine era will be the most disruptive in the history of IT. New systems that begin to fulfill some of the earliest visions of what information technologies might accomplish (doing what we thought only people could do, and machines could not) are now finally emerging. Gartner expects individuals will invest in, control and use their own smart machines to become more successful. Enterprises will similarly invest in smart machines. Consumerization versus central control tensions will not abate in the era of smart-machine-driven disruption. If anything, smart machines will strengthen the forces of consumerization after the first surge of enterprise buying commences.

10. 3-D Printing
Worldwide shipments of 3D printers are expected to grow 75 percent in 2014, followed by a near doubling of unit shipments in 2015. While very expensive “additive manufacturing” devices have been around for 20 years, the market for devices ranging from $50,000 down to $500, with commensurate material and build capabilities, is nascent yet growing rapidly. The consumer market hype has made organizations aware that 3D printing is a real, viable and cost-effective means to reduce costs through improved designs, streamlined prototyping and short-run manufacturing.

About Gartner Symposium/ITxpo
Gartner Symposium/ITxpo is the world's most important gathering of CIOs and senior IT executives. This event delivers independent and objective content with the authority and weight of the world's leading IT research and advisory organization, and provides access to the latest solutions from key technology providers.

Julius Macaulay is the Principal Consultant at TECRES Consult - www.tecres.com.ng, providing ICT training and document management consultancy services for schools and organizations. He holds a Master’s degree in Information Technology, with a special interest in the paperless office.

Monday, June 2, 2014

THE TOP TEN STRATEGIC TECHNOLOGY TRENDS FOR 2014. Vol III

5. Cloud/Client Architecture
Cloud/client computing models are shifting. In the cloud/client architecture, the client is a rich application running on an Internet-connected device, and the server is a set of application services hosted in an increasingly elastically scalable cloud computing platform. The cloud is the control point and system of record, and applications can span multiple client devices. The client environment may be a native application or browser-based; the increasing power of the browser is available to many client devices, mobile and desktop alike. The robust capabilities of many mobile devices, the increased demand on networks, the cost of networks and the need to manage bandwidth use create incentives, in some cases, to minimize the cloud application’s computing and storage footprint and to exploit the intelligence and storage of the client device. However, the increasingly complex demands of mobile users will drive apps to demand increasing amounts of server-side computing and storage capacity.
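A minimal Python sketch of this split, assuming a hypothetical cloud endpoint: the thin client exploits device storage with a local cache, while the cloud remains the system of record.

# Minimal cloud/client sketch: a thin client keeps a local cache
# (exploiting device storage and saving bandwidth) and defers the
# heavy lifting to an elastic cloud service. The endpoint URL is a
# hypothetical placeholder.
import json
import urllib.request

CLOUD_ENDPOINT = "https://api.example.com/v1/documents"  # hypothetical
local_cache = {}  # client-side storage to reduce network round-trips

def get_document(doc_id: str) -> dict:
    """Serve from the device cache when possible; otherwise fetch
    from the cloud, which remains the system of record."""
    if doc_id in local_cache:
        return local_cache[doc_id]
    with urllib.request.urlopen(f"{CLOUD_ENDPOINT}/{doc_id}") as resp:
        doc = json.load(resp)
    local_cache[doc_id] = doc
    return doc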

6. The Era of Personal Cloud
The personal cloud era will mark a power shift away from devices toward services. In this new world, the specifics of devices will become less important for the organization to worry about, although the devices will still be necessary. Users will work across a collection of devices, with the PC remaining one of many options, but no one device will be the primary hub; rather, the personal cloud will take on that role. Access to the cloud, and the content stored or shared from it, will be managed and secured, rather than the focus resting solely on the device itself.

7. Software Defined Anything
Software-defined anything (SDx) is a collective term that encapsulates the growing market momentum for improved standards for infrastructure programmability and data center inter-operability driven by automation inherent to cloud computing, DevOps and fast infrastructure provisioning. As a collective, SDx also incorporates various initiatives like OpenStack, OpenFlow, the Open Compute Project and Open Rack, which share similar visions. As individual SDx technology silos evolve and consortiums arise, look for emerging standards and bridging capabilities to benefit portfolios, but challenge individual technology suppliers to demonstrate their commitment to true inter-operability standards within their specific domains. While openness will always be a claimed vendor objective, different interpretations of SDx definitions may be anything but open. Vendors of SDN (network), SDDC (data center), SDS (storage), and SDI (infrastructure) technologies are all trying to maintain leadership in their respective domains, while deploying SDx initiatives to aid market adjacency plays. So vendors who dominate a sector of the infrastructure may only reluctantly want to abide by standards that have the potential to lower margins and open broader competitive opportunities, even when the consumer will benefit by simplicity, cost reduction and consolidation efficiency.