Wednesday, February 20, 2019

Describing Enterprise Artificial Intelligence: Platform as a Service With Current AI Infrastructure



When it comes to Enterprise AI, it is hard to think of an application that doesn't use a database. From mobile to web to desktop, every modern application relies on some form of database. Some apps use flat files, while others rely on in-memory or NoSQL databases.

Traditional enterprise applications interact with large database clusters running Microsoft SQL Server, Oracle and the like. The fact is that every application needs one.

Like databases, Artificial Intelligence (AI) is moving towards becoming a core component of modern applications. In the coming months, almost every application that we use will depend on some form of AI.

Enterprise AI simulates the cognitive functions of the human mind — learning, reasoning, perception, planning, problem solving and self-correction — with computer systems. Enterprise AI is part of a range of business applications, such as expert systems, speech recognition and image recognition.

Figure 1


1. Start Consuming Artificial Intelligence APIs

This approach is the least disruptive way of getting started with AI. Many existing applications can become intelligent through integration with language understanding, image pattern recognition, text-to-speech, speech-to-text, natural language processing, and video search APIs.

Let's look at a concrete example: analysing customer sentiment in recorded service calls. Almost all customer calls to the service team are recorded for random sampling.

A supervisor routinely listens to the calls to assess their quality and the overall satisfaction level of customers. But this analysis covers only a small subset of all the calls the service team receives. This use case is an excellent candidate for AI APIs: each recorded call can first be converted into text, which is then sent to a sentiment analysis API that returns a score directly representing the customer's satisfaction level.

The best part is that the process takes only a few seconds per call, which means the supervisor now has visibility into the quality of all calls in near real time. This approach enables the company to quickly escalate incidents involving unhappy customers or rude customer service agents. From CRM to finance to manufacturing, customers will benefit tremendously from integrating AI, and there are multiple AI platform and API providers to choose from.
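To make the use case concrete, here is a minimal Python sketch of such a pipeline. The endpoints, response fields and escalation threshold are hypothetical placeholders, not any particular vendor's API:

```python
import requests

# Hypothetical endpoints -- substitute your chosen provider's real APIs.
SPEECH_TO_TEXT_URL = "https://api.example.com/v1/speech-to-text"
SENTIMENT_URL = "https://api.example.com/v1/sentiment"
API_KEY = "YOUR_API_KEY"

def score_call(recording_path):
    """Convert one recorded call to text, then return a sentiment score."""
    # Step 1: send the audio file to a speech-to-text API.
    with open(recording_path, "rb") as audio:
        transcript = requests.post(
            SPEECH_TO_TEXT_URL,
            headers={"Authorization": f"Bearer {API_KEY}"},
            files={"audio": audio},
        ).json()["text"]                # assumed response field

    # Step 2: send the transcript to a sentiment-analysis API.
    result = requests.post(
        SENTIMENT_URL,
        headers={"Authorization": f"Bearer {API_KEY}"},
        json={"text": transcript},
    ).json()
    return result["score"]             # assumed range: -1.0 to 1.0

# Score every recorded call and flag unhappy customers for escalation.
for call in ["call-0001.wav", "call-0002.wav"]:
    if score_call(call) < -0.3:        # threshold chosen for illustration
        print(f"Escalate {call}: customer appears dissatisfied")
```

The supervisor's manual sampling becomes a loop over every recording, which is exactly where the near-real-time visibility comes from.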


2. Build and Deploy Custom AI Models in the Cloud

While consuming APIs is a great start for AI, it is often limiting for enterprises.

Once customers have seen the benefits of integrating Artificial Intelligence with their applications, they will be ready to take it to the next level.

This step includes acquiring data from a variety of existing sources and implementing a custom machine learning model. It requires creating data processing pipelines, identifying the right algorithms, training and testing machine learning models and finally deploying them in production.
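As a minimal sketch of the training and testing stage, here is what that might look like in Python with scikit-learn. The CSV file, column names and model choice are illustrative assumptions, not a prescription:

```python
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

data = pd.read_csv("customer_churn.csv")   # hypothetical dataset
X = data.drop(columns=["churned"])         # features
y = data["churned"]                        # label to predict

# Hold out 20% of the data for testing.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

model = RandomForestClassifier(n_estimators=100, random_state=42)
model.fit(X_train, y_train)

print("Test accuracy:", accuracy_score(y_test, model.predict(X_test)))
```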

Similar to Platform as a Service, which takes your code and scales it in a production environment, Machine Learning as a Service offerings take your data and expose the final model as an API endpoint. The benefit of this deployment pattern lies in using cloud infrastructure to train and test the models: customers can spin up infrastructure powered by advanced hardware configurations based on GPUs and FPGAs.
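To make the "model as an API endpoint" idea concrete, here is a minimal sketch using Flask, assuming the model from the previous step was saved with joblib. MLaaS platforms automate exactly this kind of wrapper, plus scaling and authentication:

```python
import joblib
from flask import Flask, jsonify, request

app = Flask(__name__)
model = joblib.load("model.joblib")  # assumed: the model saved after training

@app.route("/predict", methods=["POST"])
def predict():
    # Expects a JSON body like {"features": [[5.1, 3.5, 1.4, 0.2]]}.
    features = request.get_json()["features"]
    prediction = model.predict(features).tolist()
    return jsonify({"prediction": prediction})

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080)
```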

Several major cloud providers offer Machine Learning as a Service platforms.

3. Run Open Source AI Platforms On-Premises

The final step in AI-enabling applications is to invest in the infrastructure and teams required to generate and run the models locally. This is for enterprise applications with a high degree of customization and for those customers who need to comply with policies related to data confidentiality.

If Machine Learning as a Service (MLaaS) is comparable to PaaS, then running AI infrastructure locally is comparable to a private cloud. Customers need to invest in modern hardware based on SSDs and GPUs designed for parallel processing of data. They also need expert data scientists who can build highly customized models on top of open-source frameworks. The biggest advantage of this approach is that everything runs in-house: from data acquisition to real-time analytics, the entire pipeline stays close to the applications. The flipside is the capital and operating expense, and the need for experienced data scientists.

Customers implementing their own AI infrastructure typically standardize on one of the established open-source platforms for Machine Learning and Deep Learning.

If you want to get started with AI, explore the APIs first before moving to the next step. For developers, the hosted MLaaS offerings may be a good start. Artificial Intelligence is evolving to become a core building block of contemporary applications, and it is set to become as common as databases. It's time for organizations to create a roadmap for building intelligent applications.

AI Data Evolution: Data Processing and Neural Networks

Today we feed computers vast amounts of data so that they can learn from it; deep learning technologies are a major reason organizations are taking on AI initiatives.

Neural networks process information in a similar way the human brain does. The network is composed of a large number of highly interconnected processing elements (neurons) working in parallel to solve a specific problem. Neural networks learn by example to solve complex signal processing and pattern recognition problems, including speech-to-text transcription, handwriting recognition and facial recognition.
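To make the idea of interconnected neurons learning by example concrete, here is a toy two-layer network in NumPy that learns the XOR function. It is a teaching sketch, not production deep learning code:

```python
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)  # inputs
y = np.array([[0], [1], [1], [0]], dtype=float)              # XOR targets

W1 = rng.normal(size=(2, 8))   # connections: 2 inputs -> 8 hidden neurons
b1 = np.zeros((1, 8))
W2 = rng.normal(size=(8, 1))   # connections: 8 hidden neurons -> 1 output
b2 = np.zeros((1, 1))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.5
for _ in range(20_000):
    # Forward pass: each layer of neurons computes in parallel.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)

    # Backward pass: learn from the examples by adjusting connection weights.
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * (h.T @ d_out)
    b2 -= lr * d_out.sum(axis=0, keepdims=True)
    W1 -= lr * (X.T @ d_h)
    b1 -= lr * d_h.sum(axis=0, keepdims=True)

print(out.round(2))  # should approach [[0], [1], [1], [0]]
```

Real networks for speech or face recognition have millions of these weighted connections, but the principle of learning by adjusting them is the same.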

Data processing is, generally, “the collection and manipulation of items of data to produce meaningful information.” In this sense it can be considered a subset of information processing, “the change (processing) of information in any manner detectable by an observer.”

At the heart of AI data processing is the need for high-quality data. While data quality has always been important, it is arguably more vital than ever for AI initiatives.

In 2017, research firm Gartner described a few of the myths related to AI. One is that AI is a single entity that companies can buy. In reality, it’s a collection of technologies used in applications, systems and solutions to add specific functional capabilities, and it requires a robust AI infrastructure.

Another myth is that every organization needs an AI strategy or a chief AI officer. The fact is, although enterprise AI technologies will become pervasive and increase in capabilities in the near future, companies should focus on business results that these technologies can enhance.

Conclusion

To conclude, almost any organization can put Artificial Intelligence to work. As we found, many existing applications can become intelligent through integration with language understanding, image pattern recognition, text-to-speech, speech-to-text, NLP and video search APIs.

As we saw earlier, a supervisor can only listen to a random sample of recorded customer calls. With AI APIs, every call can be converted into text and passed through sentiment analysis, producing a score that directly represents the customer's satisfaction level.

So, with the help of AI APIs, we can more easily understand customer problems and deliver a better product.

I have also described why building and deploying custom AI models in the cloud is the natural next step, and why it is useful and beneficial for customers.

Custom models take customers to the next level: they acquire data from a variety of sources, feed data processing pipelines, help identify the right algorithms, and are ultimately deployed to production.

Finally, use AI-powered applications to reduce customer pain, cut complexity, and make processes more effective.

Monday, February 18, 2019

OData and REST APIs - A Comparison



Are you curious about what REST and OData are? Then this article is for you. In it, you will find an explanation of what REST is and how it applies to the construction of RESTful APIs; what OData is and how it is used to build web services; and the differences between the two.

Brief comparison

The table below summarizes the main similarities and differences between REST APIs and OData Services.



Table 1: REST and OData comparison

As the table shows, both technologies follow REST principles, although OData is more relaxed: if there is a good reason to deviate from a principle, it lets you do so.

Thus the OData protocol is more comprehensive than the REST model: in addition to applying the REST principles, it provides a way to describe both the data and the data model.

So, let’s go deep into the details by explaining each of the concepts mentioned in the above table.

What is REST

Representational State Transfer, better known as REST, is a software architectural style that defines a set of principles useful for creating efficient web services.

These principles were described by Roy Fielding in his doctoral dissertation written in 2000. As he stated during an interview:

“…That process honed my model down to a core set of principles, properties, and constraints that are now called REST”

In his dissertation Fielding identified six aspects of HTTP and HTML that made these technologies successful and efficient. These concepts or principles are:

  1. Client–server
  2. Stateless
  3. Cacheable
  4. Layered system
  5. Code on demand (optional)
  6. Uniform interface
    • Identification of resources
    • Manipulation of resources through these representations
    • Self-descriptive messages
    • Hypermedia as the engine of application state (HATEOAS)

The HTTP protocol is an example of a system that implements the principles of REST.

What is a REST API

A REST API is simply an Application Programming Interface that follows the above-mentioned principles. REST APIs usually communicate using XML or JSON, although the REST architecture doesn't require anything specific in this regard. In fact, REST doesn't demand any particular format, and accepts any format that can be agreed upon via content negotiation.
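As a small illustration, the following Python snippet requests the same resource from a hypothetical REST API twice, negotiating JSON and then XML via the standard Accept header:

```python
import requests

BASE = "https://api.example.com"   # hypothetical service

# Same resource, two representations, selected by content negotiation.
as_json = requests.get(f"{BASE}/customers/42",
                       headers={"Accept": "application/json"})
as_xml = requests.get(f"{BASE}/customers/42",
                      headers={"Accept": "application/xml"})

print(as_json.headers.get("Content-Type"))  # e.g. application/json
print(as_xml.headers.get("Content-Type"))   # e.g. application/xml
```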



Figure 1: REST API

What is OData

OData, or the Open Data Protocol, is an application-level protocol that describes a way of interacting with data using RESTful services. At present it is at version 4.0, which was standardized by OASIS and approved as an ISO/IEC International Standard.

Initially created by Microsoft in 2007, it was later released under the Microsoft Open Specification Promise, and thus made available to all.

The protocol is based on two main components: 
  1. Industry standards: HTTP, XML, Atom and JSON.
  2. REST-based architecture: HTTP protocol.

However, the specification states that:

  1. OData must “follow REST principles unless there is a good and specific reason not to”.
  2. OData Services MUST support the Atom encoding.
  3. OData services SHOULD support a JSON encoding.

Figure 2 shows the structure of an OData-compliant web service. It rests on the HTTP protocol, which satisfies the requirement of being REST-based. In addition, it must include the Atom encoding, which is based on XML, and adhere to OData's data handling specifications. Since version 4.0, JSON is fully supported as well.



Figure 2: an OData compliant construct

OData uses Atom or JSON to define the data schema, and it provides two important models for the management of data. They are: 
  1. Entity Data Model or EDM: it is an abstract data model that MUST be used to describe the data exposed by the service.
  2. Service Model: it consists of two static resources and a set of dynamic resources. The static resources provide a way to ask a service about its data model, and the dynamic ones offer methods to manage the data model.

The two static resources are: 
  1. The metadata document, which describes the data model.
  2. The service document, which lists all of the top-level entity sets exposed by the service.
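Both static resources, plus a dynamic query, can be exercised with plain HTTP. The sketch below uses Python's requests library against a hypothetical OData v4 service root; only the $metadata path and the query options ($filter, $orderby, $top) come from the OData specification, while the root URL and the "Orders" entity set are assumptions:

```python
import requests

ROOT = "https://api.example.com/odata/v4"   # hypothetical service root

# Static resource 1: the service document lists the top-level entity sets.
service_doc = requests.get(ROOT, headers={"Accept": "application/json"}).json()
print([entry["name"] for entry in service_doc["value"]])

# Static resource 2: the $metadata document describes the Entity Data Model.
metadata_xml = requests.get(f"{ROOT}/$metadata").text

# Dynamic resource: query an entity set using OData's query options.
orders = requests.get(
    f"{ROOT}/Orders",                       # "Orders" is an assumed entity set
    params={"$filter": "Total gt 100", "$orderby": "Total desc", "$top": 5},
    headers={"Accept": "application/json"},
).json()["value"]
```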

An important feature of the OData protocol is its support for a versioning scheme that enables services to be backward compatible.

In addition, there are several libraries that can be used to facilitate building and consuming OData-based services. Among them, Restier is the main library for the .NET framework, and Apache Olingo is the most important for the Java platform.

Conclusion

In this article we have seen what REST means, how it is used in RESTful APIs and in the OData Protocol. In addition, we have described the OData protocol’s main components.

We have concluded that REST underlies OData while remaining more general. Because OData is more prescriptive, it can be considered a safer guide for creating RESTful services.

Read more related articles in our blog series.

Friday, February 15, 2019

How Integration platform is driving the 4th industrial revolution in 2019



The first Industrial Revolution paved the way for a new energy source – steam – which pushed industries towards mechanization and helped them improve productivity.

The Second Industrial revolution introduced electricity and other innovations in infrastructure. It created new industries and opportunities to thrive on mass production lines.

By the time we arrived at the Third Revolution in the late twentieth century, information technology was the major change agent driving the automation of production processes.

And here we are now at the next big milestone in the history of industrial revolutions – the Fourth. It is getting much attention and is hailed as a milestone as important as the first. Let's have a look at why.

The Fourth Industrial Revolution (4IR) is all about connectedness and how we can embed technology into our societies and industries. Emerging technology breakthroughs will speed up the digitization of industries, with cloud technologies and 5G as the front runners. Robotics, Artificial Intelligence and the Internet of Things (IoT) are a few of the emerging fields where such innovation is happening.

4IR, or Industry 4.0, is not about inventing technology for technology's sake. The diagram below shows how 4IR enhances connectedness and productivity, and touches on some of the social issues involved.


4th Industrial Revolution
  
How integration platforms will drive the revolution

This is the age of disruptive technology. It is creating a level playing field for businesses with innovative and agile business models. Whether for a large enterprise or a small business, constant innovation and integration of business processes have become essential.

Platforms with No-code/Low-code capabilities

With No-code and Low-code development platforms, businesses can build software applications with little or no hand-written code. These platforms use a visual modeling approach with pre-built functional components available in the platform library. Developers simply select components and add them to the workflow that makes up the application. This helps developers achieve results with greater efficiency.

This approach also makes progress visible to people at all levels of the business. It also saves businesses a lot of money, as they do not need to stand up and maintain their own environments and infrastructure; the platforms offer nearly everything in one package. Let's take a look at how No-code and Low-code development capabilities work.


No-code and Low-code


No-code development capabilities

No-code development capabilities enable business users to solve basic functional problems between departments with simple solutions. These platforms are well suited to SMBs, which can solve their integration challenges without hiring professional developers, since there is no need to write code to implement such applications.

Low-code development capabilities

Low-code capabilities cater to enterprises that have technology governance requirements. They take a more structured approach, with scalable architectures and flexibility in terms of on-premises or cloud implementation. Low-code platforms also extend their capabilities through open APIs. They support complex integrations using comprehensive components in the platform library, and can incorporate innovative next-generation technologies and third-party services available via open source.

These platforms can create sophisticated applications for enterprises that span various departments and domains. They offer developers better control over quality testing and performance tooling, which results in high productivity and speedy deployment.

API based ecosystem

With all the choices and capabilities an integration platform offers, there is no denying the power of the APIs that make these integrations possible.


API-Integration

As businesses grow bigger, requirements also grow wider. Enterprises need applications to work across departments and domains. They need to incorporate next-gen technologies to increase productivity. Then there is regulatory compliance and technology governance.

Also, there are challenges of modernizing the legacy systems to co-exist with Cloud infrastructure.

APIs act as an intermediary between on-premises systems and the cloud, so applications remain neutral to the platform they run on and retain access to data and other services as they move to the cloud.

Integration platforms and APIs work hand in hand to help businesses integrate applications and manage data with faster deployment and higher efficiency. They save organizations time and money and, needless to say, the errors that come from complex hand-written code. These platforms can drive the revolution by multiplying productivity.

Wednesday, February 13, 2019

Handling EDI using Cloud Workflow



Electronic Data Interchange, or EDI, has existed since the early 70s. Today, many IT giants, such as IBM, are still using it. One of the many reasons is its practicality: this data format offers a way to transfer commercial documents quickly and easily.

In this article, you will learn about the benefits of EDI, and how to create a collective digital model using EDI files and DBSync’s Cloud Workflow.


Benefits of EDI

EDI has been in the market for a long time, largely because of the benefits it delivers.

“EDI continues to prove its major business value by lowering costs, improving speed, accuracy and business efficiency. The greatest EDI benefits often come at the strategic business level.”

EDI Basics

EDI’s most important benefits can be summarized as:



Table 1: Benefits of EDI

These benefits can be capitalized on in many business cases. One of the most important is Supply Chain Integration.


Use Case: Supply Chain Integration

Today businesses operate in a multi-enterprise environment. An important aspect of this ecosystem is the value added through the integration of partners and customer communities.

Collaborative digital models can deliver great things, such as supplying manufacturing lead times, product availability data, demand forecasts from end customers, and more.

This data availability could, for example, allow you to consider precise inventory levels across the supply chain, based on demand data, prices and market calculations.

Thus, a B2B digital chain can benefit from EDI, made possible with DBSync Cloud Workflow: EDI is used to describe the documents, and Cloud Workflow executes the document transfer via FTP servers.

Once the EDI file has been transferred, a third application, usually called an EDI converter, is necessary to interpret the code contained in the EDI file and transform it into a proper document.


Figure 2: Supply Chain Integration


How Cloud Workflow Manages the Document Transfer

There are two distinct operations on an EDI document. First, the EDI file can be transferred between different points, for example via FTP servers. Second, the EDI document needs to be parsed and transformed into the actual document.

Cloud Workflow can easily transfer text documents stored in an FTP server to another data source, including another FTP server. The files are handled via a separator, in a similar manner to CSV files.

This feature allows Cloud Workflow to transfer EDI files. Although EDI files are defined by many different standards, such as ANSI X12, EDIFACT and TRADACOMS, their common characteristic is that they are delimiter-separated text, so they can all be handled via a separator.

However, it must be noted that Cloud Workflow doesn't parse the EDI document; it only transfers the text file containing the EDI code, as sketched below.



Figure 3: EDI transferred via Cloud Workflow
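For intuition, the following Python sketch shows roughly what the transfer step does under the hood: copy an EDI text file from one FTP server to another. The hostnames, credentials and file name are placeholders, and in practice Cloud Workflow configures this visually rather than in code:

```python
from ftplib import FTP
from io import BytesIO

buffer = BytesIO()

# Download the EDI file from the source FTP server into memory.
with FTP("ftp.source.example.com") as src:
    src.login("user", "password")
    src.retrbinary("RETR orders.edi", buffer.write)

# Upload the same bytes, untouched, to the destination FTP server.
buffer.seek(0)
with FTP("ftp.destination.example.com") as dst:
    dst.login("user", "password")
    dst.storbinary("STOR orders.edi", buffer)
```

Note that the file content is never interpreted; parsing is left to the EDI converter at the receiving end.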

In order to understand how Cloud Workflow transfers EDI documents between FTP servers, let’s take a look at the next figure, which shows the basics of an ANSI EDI document definition.

An EDI document starts with a “Start of transaction” indicator, and ends with an “End of transaction” definition. In the middle, the contents of the document are defined.

In EDI terms, a single document is called a "transaction set" or "message". A document can be an invoice, a purchase order or another commercial document. A transaction set is composed of data elements, segments and envelopes.

Data elements are the individual items of the document. They are contained in segments, separated by a data element separator; in the example shown in the figure, the separator is an asterisk (*).

Finally, a document is stored in an envelope, which can contain one or more documents. Envelopes are then transferred between sites.

By using this EDI file structure, Cloud Workflow can transfer EDI documents located on FTP servers in much the same way it transfers CSV files.




Figure 4: Structure of an EDI document
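To make the structure in Figure 4 concrete, here is a toy Python snippet that splits a simplified ANSI X12 envelope into segments and data elements. Real EDI converters do far more (the ISA header alone has sixteen fixed fields, abbreviated here); this only shows why a separator-based transfer works:

```python
# A simplified ANSI X12 purchase order: "~" terminates segments and "*"
# separates data elements. The envelope (ISA/IEA) wraps the transaction
# set (ST/SE). Contents are abbreviated for illustration.
raw = (
    "ISA*00*SENDER*00*RECEIVER*190213*1200*U*00401*000000001*0*P*>~"
    "ST*850*0001~"                      # start of transaction set (850 = PO)
    "BEG*00*SA*PO-12345**20190213~"     # beginning segment with the PO number
    "SE*3*0001~"                        # end of transaction set
    "IEA*1*000000001~"                  # end of envelope
)

for segment in raw.split("~"):          # split the envelope into segments
    if segment:
        elements = segment.split("*")   # split a segment into data elements
        print(elements[0], "->", elements[1:])
```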

Wrapping Up

We have seen that DBSync’s Cloud Workflow can easily transfer EDI documents stored in FTP servers. The next step is to convert the EDI code into a proper file.

This feature opens immense possibilities. One very important option is its use in Supply Chain Integration, where a chain of FTP servers could serve as a link between different points in the supply chain.

The opportunities opened by this collective digital model are many: from production based on customer data, to inventory planning, reduction of lead times, and more.

Would you like to learn more about Cloud Workflow and its powerful features? Visit our website at https://www.mydbsync.com/, where you can try our app for free.