Monday, August 6, 2018

Swagger examined (part 3): How to create a connector to your app using DBSync



— Before proceeding, I’d suggest you read Part 1 and Part 2 of this article. —

In this age of integration, you need an edge, and the DBSync Swagger API connector is it. If you have a Swagger-compliant JSON description of your app’s RESTful API, you can use this tool to create a connector to your app, which can then serve as a bus to connect to other apps, such as Salesforce or QuickBooks, via the DBSync app.

In this tutorial, you will learn how to create this kind of connector, helping you build a meaningful and efficient ecosystem for your business.

Creating a Swagger API connector

Creating a Swagger API connector is simple. You just need to follow these two easy steps:

Step 1: Create a new project.

This step is required only if you don’t already have a project to which you want to add the connector. To create one, go to the menu on the left side of the DBSync app and click on the Projects item. The figure below shows the resulting screen. Now, click on the Create New Project button, enter the name of your project and press the Save button. Your project will appear in the list of projects on the screen.



Figure 1 – Create a new project

Step 2: Create a new Swagger API connector

Once you have your project ready, select it and go to the main menu item Connectors. Once there, click on the Create New Connector button. You will see a screen asking for a name and connector type. Type in the name of your connector, and select Swagger API Connector from the dropdown menu. Then, click on the Save button. A new screen asking for the connector’s information will appear.



Figure 2 – Connectors’ item in main menu


Figure 3 – Create a new connector


Then, in the next screen (figure 4), enter the following parameters:
1. The username and password of your account in your app.
2. Authentication type: select one of the following values from the dropdown menu:
   a. NoAuth: when no authorization is required.
   b. BasicAuth: when you are using basic access authentication.
   c. OAuth 2.0: when you need to use the OAuth protocol.
   d. ApiCodeAuth: when the authorization mechanism is defined in your code.
3. The URL to your RESTful API.
4. The JSON file that documents your API (copy and paste its contents; see the sketch below).
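
For reference, the JSON you paste in step 4 is the Swagger description of your API covered in Part 1 of this series. A minimal sketch, with hypothetical host and path names:

```json
{
  "swagger": "2.0",
  "info": { "title": "My App API", "version": "1.0" },
  "host": "api.myapp.example.com",
  "basePath": "/v1",
  "schemes": ["https"],
  "paths": {
    "/customers": {
      "get": {
        "produces": ["application/json"],
        "responses": {
          "200": { "description": "A list of customers" }
        }
      }
    }
  }
}
```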



Figure 4 – Connector’s parameters

Once all the required inputs are completed, click on the Save button. You will then see the following screen:



Figure 5 – Connector successfully saved

That is it! Your connector is ready to do the work for you.

You can test the connection by pressing the Validate Connection button, just to be sure that all is working correctly.

Final Words

In this series of three articles on Swagger, you have seen the importance of Swagger as a standard for creating APIs. You have also learnt how to create a JSON file that documents your app’s RESTful service. And finally, in this third part, you have seen how to create a Swagger API connector using DBSync.

Ready to learn more? Check out our website and start your free trial now!

Saturday, August 4, 2018

Swagger examined (part 2): DBSync’s Swagger API



— If you haven’t already read Part 1 of this article, I strongly recommend you do so first. —

Have you ever run into difficulty when connecting RESTful services? You are not alone. Many junior developers, as well as seasoned professionals, have struggled with this problem. The great news is that the unique challenges posed by these developments can be lessened with the use of Swagger.

This article continues the previous one with a deeper description of this open-source framework, a must-have for serious API (Application Programming Interface) developers. It then explains the basic reasons why you should use it, and how DBSync employs it to help you create connectors to apps of your choice in an easy-to-follow manner. At the end, it includes a link to the next article in this series, which explains how to create customized connectors for your apps.

What is Swagger

Swagger is an open-source framework consisting of standards and tools that help you design, build, document and consume RESTful web services.

Started as a project by Tony Tam in 2011, it has grown in size and importance. At present it is sponsored by SmartBear Software, and it has been eagerly accepted by the market.

Why use Swagger

Reason 1: Add value to your products by creating an ecosystem based on common knowledge

Swagger consists of a set of rules and tools for describing APIs. Its rules standardize practices in areas such as parameter definitions, paths, responses and more. Its tools are language agnostic, meaning that you can choose the programming language you prefer.

Having a common way of describing APIs helps create a system that can be easily connected to others, and that can be updated and modified without difficulty by many developers.

An example is Netflix, which is based on a microservice architecture that provided an innovative and efficient way of delivering a service at scale and in real time. As a result, the company rose to become a market leader.

Reason 2: Gain efficiency through better communication

We all know how difficult it is to translate a business idea into a programming success. One of the main challenges is communication: businesspeople, analysts, product managers and users find it difficult to understand the techies’ language.

Swagger is great in this regard, as one of its characteristics is that it is easily comprehensible for developers and non-developers alike, creating a great communication channel.

Thus, for example, product managers, analysts and even users can easily give input about the design, and developers can then translate these ideas into software.

This communication is possible not only during development, but also during testing and documentation, helping create a system that is complete, easy to understand and customer oriented.

Reason 3: Use a widely proven tool

Yelp is a company that handles reviews and bookings for businesses. When a company like Yelp, which deals with thousands of people making reviews, relies on Swagger, you can be assured that the effectiveness of the framework has been properly proven.

With many important players – such as Netflix, Akana and Yelp – relying on Swagger for their API developments, you should have no doubt about using it.

In addition, Swagger has the support of several big companies, such as Microsoft, IBM and Google, through the Open API Initiative.

Reason 4: Develop with an open-source framework

By being open source, Swagger carries many of the benefits of crowd-based development, such as cost savings, flexibility and accountability.

Open-source communities have a strong sense of values, which results in responsible and efficient products. In addition, many volunteers test its usage, making it a safe option. Besides, you can also become a contributor and join ranks with giants such as Microsoft and Google by supporting Swagger’s development.

Thus, whether you are a big player with a fat budget or not, you can always benefit by choosing Swagger.

How DBSync uses Swagger
DBSync Cloud leverages the main advantages of Swagger to let you create the connector you need. The only thing actually required is a JSON file with all the information describing your app’s API.






Figure 1 – Creating a connector
Once you have your JSON file, DBSync helps you create a connector to your app, which you can use to set up data transfers to other apps, such as Salesforce, QuickBooks and many more. You can even create customized connectors to any two apps and later connect them via DBSync. Once you have the two apps mapped to each other, your data transfer is straightforward.




Figure 2 – Integrating applications

Thus, you can design and build an integrated ecosystem that best fulfills the needs of your business. The possibilities are almost unlimited.

Conclusion

APIs are not new, and at present they are increasingly used to integrate apps. Basically, when you create a Swagger-compliant API, you are allowing others to make use of your app. This translates into more business opportunities, more customers and better profits. Swagger is the right technology to achieve this.

In Part 3 of this article, we will learn how to create a connector to your app with DBSync, by using your API’s JSON.

Find this exciting? Go to our website and try any of our products for free. If you have any queries, contact us, and we will get back to you as soon as possible.

Friday, August 3, 2018

Swagger examined (part 1): How to document an API with incredible simplicity


Do you know what Swagger’s main characteristic is? It is its incredible simplicity: its capacity to easily and completely describe how an API works, where it resides, what inputs it needs, and what results it produces.

It is this simplicity that helped make Swagger one of the most successful development tools available on the market.

It is the same need for efficiency that makes Swagger-compliant REST services so invaluable when a new API is created.

In this article you will learn the basics of how to create Swagger RESTful documentation based on another popular technology known as JSON.

So let’s start by learning about a great tool: the Swagger Editor.

Using Swagger
There are several editing tools that can help you create Swagger-compliant APIs. One of them is the Swagger Editor, which you can use to create your API documentation. It is an open-source tool that you can download or use online.


Figure 1 – Swagger Editor

Learning JSON

If you are looking for a good reference on JSON, you can consult the standard, available here. You can also read RFC 7159, which is available here. Both present a complete description of JSON, but they are very technical.

Alternatively, if you need a beginner’s site that will teach you JSON from scratch, you can go to W3Schools.

Document description

The Swagger standard defines the structure of the JSON file that describes a RESTful web service.
Below is a simplified description of the main sections.

Document sections

The main sections that you need to understand are: the API declaration, the paths and the definitions.
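
To see how these fit together, here is a minimal sketch of the overall shape of a Swagger 2.0 document (all names are illustrative placeholders):

```json
{
  "swagger": "2.0",
  "info": { "title": "Example API", "version": "1.0" },
  "basePath": "/v1",
  "paths": {},
  "definitions": {}
}
```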

API declaration

The API declaration section describes the main aspects of an API, such as its title, version, where it is hosted, the schemes that it uses, and the MIME types it consumes and produces.
The required parts are:

1. Swagger version: specifies the Swagger Specification version being used.
2. basePath: the root URL serving the API.
3. The apis or paths section: a list of the APIs exposed on this resource. There can be no more than one API object per path.

The consumes and produces sections list the MIME types the resource consumes and produces.
Additional optional information, shown in the figure below, includes the license, the host and the x-parameters. The x-parameters, such as x-response-strategy, are key-value pairs defined by you. An example follows the figure.


Figure 2 – API declaration
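
As a concrete sketch of such a declaration (all values are hypothetical):

```json
{
  "swagger": "2.0",
  "info": {
    "title": "Example API",
    "version": "1.0",
    "license": { "name": "Apache 2.0" }
  },
  "host": "api.example.com",
  "basePath": "/v1",
  "schemes": ["https"],
  "consumes": ["application/json"],
  "produces": ["application/json"],
  "x-response-strategy": "fail-fast",
  "paths": {}
}
```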

MIME types
The MIME types that can be consumed or produced are summarized in the table below. In our example we use the JSON type, but XML and XHTML are also possible options.


Table 1 – MIME types

Paths
The paths section describes the different operations. The required parameters are:
1. Method: the HTTP method required to invoke the operation. The possible values are GET, POST, HEAD, PUT, PATCH, DELETE, OPTIONS and TRACE.
2. Parameters: the inputs to the operation. Each parameter must include its name and type; other characteristics are optional.

Additional optional parameters, shown in the figure below and sketched after it, are:
1. Produces: a list of MIME types produced; in this case, the JSON type.
2. Responses: a description of the responses available for this operation.
3. Path parameters: written between curly brackets, they must be provided by the API call.




Figure 3 – Paths
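
Here is a sketch of a paths entry for a hypothetical operation that fetches a user by id, showing the method, a path parameter between curly brackets, the produces list and the responses:

```json
{
  "paths": {
    "/users/{id}": {
      "get": {
        "produces": ["application/json"],
        "parameters": [
          {
            "name": "id",
            "in": "path",
            "required": true,
            "type": "string",
            "description": "Hypothetical path parameter identifying the user"
          }
        ],
        "responses": {
          "200": { "description": "The requested user" },
          "404": { "description": "User not found" }
        }
      }
    }
  }
}
```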

Definitions
The definitions (or data models) section contains a list of data objects. Each data object can have several parameters.
The required parameters are:
1. Name: the name of the data object.
2. Properties: the fields that define the data object. Each property has a name and a type.



Figure 4 – Definitions
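
For example, a definitions section describing a hypothetical User data object could look like this:

```json
{
  "definitions": {
    "User": {
      "type": "object",
      "properties": {
        "id": { "type": "string" },
        "name": { "type": "string" },
        "age": { "type": "integer" }
      }
    }
  }
}
```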

Where to find more information
The best reference is the official Swagger page, available here.

Conclusion
The Swagger standard provides a great way to document APIs. In this article you have learnt the main components of this documentation, namely the API declaration, the paths and the definitions (or data models).
In the next article of this series you will read about the general characteristics of Swagger, its core benefits and how DBSync uses it for your business.
If you want to learn more about this wonderful technology, consult the Swagger page. Even better, if you want to learn how we have applied Swagger to app integration, visit our page.

Wednesday, July 18, 2018

Discover real-time integration with DBSync’s Salesforce Streaming Connector



Have you ever felt the power of speed? That is exactly what our new Salesforce Streaming Connector brings you: real time, made real.
The connector is based on the Salesforce Streaming API, which uses a technology that transmits data practically continuously, without waiting for the whole data set to arrive, enabling you to manipulate the data almost immediately after it arrives.
Moreover, the Salesforce Streaming API helps you focus your data transfer on the relevant data only, as it follows a publisher-subscriber model: users subscribe to channels that broadcast specific data changes defined via SOQL queries. For example, a channel can be designed to report changes to the email addresses on clients’ accounts, as sketched below.
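
In Salesforce, such a channel is defined by a PushTopic record whose SOQL query selects the objects and fields to watch. As an illustrative sketch (the name and query are hypothetical; see the Salesforce documentation for the full field list), the JSON body POSTed to the REST API’s PushTopic endpoint might look like:

```json
{
  "Name": "ContactEmailChanges",
  "Query": "SELECT Id, Email FROM Contact",
  "ApiVersion": 42.0,
  "NotifyForOperationUpdate": true,
  "NotifyForFields": "Referenced"
}
```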

What are the Benefits

An architecture with focused real-time processing can have a huge impact on how your business functions, because you gain knowledge as transactions take place.
With continuous data processing, a small number of transactions is processed each time, making the data available as it arrives.
Figure 1
The benefits are plentiful: data access is boosted, the necessary resources are reduced, and uptime is improved. These advantages can have an enormous effect on your business, helping you outperform your competitors and increase your profits.

When to Use Real Time Integration

Continuous data processing can have a serious impact on several areas of your business. Three important ones are improved customer service, vendor independence and error reduction.

Improved Customer Service

Having continuous information feeds can influence the way customers respond to incentives. Their responses can be analyzed almost instantly, and incentives adjusted accordingly. Inventories can be updated in a timely manner and production efforts reduced. Customers can have receipts sent to them immediately after purchasing, and enjoy almost instant gratification.

Vendor Independence

The capacity to continuously transmit data between different applications frees you from being dependent on a single vendor. In this manner, you can choose the systems that you believe are best for each need.

Error Reduction

Continuous data updates ensure that correct data is always available. Changes to contact information, banking details and other data are immediately reflected across your integrated systems, reducing inconsistencies and ensuring error-free, timely responses.

So, why is this Technology so Efficient?

The answer is simple: the foundations behind it. The Salesforce Streaming API uses several proven technologies that, combined, optimize the data transfer.
The asynchronous communication is based on the widely accepted Bayeux protocol and Ajax methodology, ensuring a persistent client-server connection where events are incrementally handled and interpreted on the client side immediately upon arrival.
Long polling, or Comet programming, minimizes the number of messages sent by waiting until data is available to be delivered, eliminating no-data messages.
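
For a flavor of the protocol, Bayeux messages are themselves small JSON documents exchanged over HTTP. A client that has completed the handshake might subscribe to the hypothetical channel above with a message like this:

```json
{
  "channel": "/meta/subscribe",
  "clientId": "client-id-from-handshake",
  "subscription": "/topic/ContactEmailChanges"
}
```

The server then pushes event messages on that /topic/ channel as the matching records change.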
In addition to these technologies, DBSync’s user-friendly interface ensures that the integration process is fast and easy, permitting you to design your own integration processes.

Concluding

We have seen that real-time integration can have a huge impact on your business. Customer needs are swiftly serviced, business performance information is current, and production and delivery are focused towards your business’ most important needs.
This continuous integrated data flow can be easily achieved with DBSync’s Salesforce Streaming Connector, enabling you to create the real time architecture that optimizes your business.
Find this exciting? Go to our website (https://www.mydbsync.com) and try it for free. If you have any questions, contact us, and one of our representatives will get in touch with you.

Monday, July 16, 2018

Data archival solutions with cloud replication



Are you facing data storage regulatory compliance problems? Fortunately, data archiving is one of those areas where regulatory compliance can be made easy and effective if you have the right archival solutions.
But first, let’s understand what data archival is all about, and why it is different from the normal backups that you are doing.

What is Enterprise Data Archival

Data archival is the storage of all the data that you need to keep because it may occasionally be useful, but that is no longer used in everyday operations.
Good data archival practices have many benefits, such as reducing the cost of primary storage (and, as a direct consequence, the cost of backup storage); providing good original proof in case of legal problems; and, very importantly, helping you stop worrying about regulatory compliance.
Let us now examine some of the most important legal requirements.

Data Archival

Data archival regulatory compliance relates to data that is still important and that may be needed in the future, particularly in case of litigation. The following are the most important regulations that may affect your business’s data archival:

SOX (Sarbanes-Oxley Act)

This act was passed after the Enron scandal, and it is focused on financial reporting. It applies to any company that is listed on the NASDAQ or the NYSE. Its main concern is to implement controls that ensure the completeness, correctness and quick accessibility of information. Within this law, some businesses are specifically targeted. For example, accounting firms that audit publicly traded companies are required to retain audit records for no less than seven years after the completion of an audit.

PCI DSS (Payment Card Industry Data Security Standard)

This is a standard that businesses using online payments are required to comply with, particularly if they use credit and debit cards, such as Visa or Mastercard.
The standard stresses information access permissions, and it is based on the principle of least privilege. This principle basically says that access to information should only be granted to those who strictly need it, thus reducing the risk of the data being compromised. In addition, it requires the use of encryption.

FINRA (Financial Industry Regulatory Authority)

Basically, FINRA audits banks and financial institutions to ensure their good behavior. Among the most important requirements, FINRA 3110 entails that you preserve your accounts, records, memoranda, books and correspondence in conformity with all applicable regulations, statements, and rules under SEC 17a-3. Several other parts of this act also relate to ensuring SEC compliance.

GDPR

Although you may think that if you are not in Europe the General Data Protection Regulation (GDPR) does not apply to your business, be aware that it applies to any organization processing the personal data of EU citizens, such as name, IP address, location, religion and ethnicity, among others. Thus, if your company deals with clients in the EU, make sure that you abide by the GDPR requirements, in force as of May 2018.
This regulation basically aims at protecting people’s information and presents new challenges for businesses. For example, Article 20 of the GDPR provides the Right of Data Portability. It basically says: if you’ve got some data about me, then I’m allowed to access it when I need it and to require you to provide it to anyone I choose. This applies to current and historical data.
Whilst regulatory compliance demands a lot of effort on your side, it can also be an opportunity to organize your data. The good news is that the pain involved can be reduced by having a sound reference architecture and good data archival solutions.

Enterprise Data Archival: Introducing DBSync Cloud Replication and CDM

DBSync Cloud Replication and CDM is an application that helps you concentrate on your business, not on compliance. DBSync assists you by automating many of the tasks, increasing archival efficiency and reducing error generation, while keeping your business secure and cost-effective. Some of its most important features include:
• The possibility to auto-create a schema and new fields by synchronizing an application to a database.
• The possibility of having real-time integration via outbound messages.
• The capacity to obtain data from an application, such as Salesforce, and a database; merge the data; use it according to your needs; and load the results back to the application.

Final words

Improving data management has become critical these days. To learn more about getting your organization’s data archival up to speed, go to our website (https://www.mydbsync.com), where you can find more information about our data archival solution.

Friday, July 13, 2018

Why data replication should not be done using ESB-based integration tools



This is one of the most common questions we get when prospects come looking for data replication tools. It’s more a question of integration design patterns than of product implementations.

Let’s get started with what an ESB is: Enterprise Service Bus. This is an integration design pattern where messages are passed so that one or more message listeners can listen to and consume the message (store and forward). These messages, like emails, have a header (from and to), a payload (the message), and perhaps attachments. Depending on the ESB, there might be limitations on payload and attachment sizes.
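
As an illustration only (every ESB has its own envelope format), such a message serialized as JSON might look like this:

```json
{
  "header": {
    "from": "crm-app",
    "to": "billing-queue",
    "messageId": "f81d4fae-7dec-11d0-a765-00a0c91e6bf6"
  },
  "payload": {
    "event": "invoice.created",
    "invoiceId": "INV-1001",
    "amount": 250.00
  },
  "attachments": []
}
```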

The flow is like this:

App produces message -> ESB receives message (in a queue) -> Based on routing rules, ESB routes message -> Listener consumes message -> Likely maps/translates data -> Saves / forwards to another app/queue -> Confirms message is received -> ESB tags and stores message as processed.

Notice the steps in bold. These are the places where the data flow can “choke” or “build up” when there is a high flow of large data sets.

Now look at data replication: you have a source of data, be it a database (common) or a cloud application (like Salesforce). In data replication, you require a complete backup of both schema and data changes. The application is expected to identify schema changes and update the target (without the need for remapping), so interpreting schema changes and having the ability to adjust the target schema becomes important. The ability to process a large number of rows is also necessary.
One of the most common ways databases replicate is by using their transactional logs (which is what you find when you look under the hood of master–slave replication). When you have disparate applications like Salesforce and Oracle, you have to rely on query -> extract -> interpret change -> check for duplicates at the target -> load on the other system.

Ok, so let’s now look at why ESB-based apps might not be the right choice:

• ESB requires store and forward, which might not be necessary for data replication. While you can debate that it will work (yes, you can make it work), it will be slow and overly complicated.
• ESB in general is considered to have a higher overhead of operations management, and it requires higher uptime, as it is mostly used for distributed app integration. Replication is usually run in batch (at scheduled times) or, in the case of master–slave setups, far more in real time than what ESBs are designed for.
• Managing schema changes often requires ESBs to remap some of the message flows. Some of our clients really dislike this: not only do they have to track source and target schemas, but they also often have to trigger a “Change Management” request up the IT chain, which can take weeks or months to get through. Data replication tools usually adjust target schemas automatically.

When you look at the integration tools market, the industry has segmented itself into two groups: one going the ESB or message queue route (which is slowly evolving into API-based integration), and the other going the data replication route.

So let us see some of the common integration apps and how they fit in:

MuleSoft – a leader in ESB-based integration; it does quite well in service-oriented architecture and in integrating apps like SAP. It is also introducing API management.

Kafka – an open-source messaging platform, very popular in high-volume messaging, especially with IoT and big data. It requires smaller message sizes.

GoldenGate (by Oracle) – a leader in data replication between different databases. It does not yet have cloud application data replication.

DBSync – Cloud Data Replication uses a direct replication technique, while iPaaS Cloud Workflow is more of a store and forward.

There are many more; perhaps a good place to look is Gartner’s Data Integration and Integration-as-a-Service Magic Quadrants to see which ones are leading the pack.

Wednesday, April 11, 2018

How to transfer data from Microsoft Dynamics 365 for Sales to a database using DBSync



Cloud data and cloud-based applications have grown exponentially over the past few years, and combining data sources and applications has become a major challenge for many companies. One of these important applications is Microsoft Dynamics 365 for Sales.
Microsoft Dynamics 365 for Sales is an application that helps companies to interact with current and potential customers. It is widely used due to its convenient characteristics.
Often the data stored in Microsoft Dynamics 365 for Sales has to be transferred to a database such as MSSQL or MySQL.
In this article, you will learn how to do this by using DBSync Cloud Workflow.

Introducing DBSync Cloud Workflow

DBSync is a platform that facilitates integration between different systems. It manages all the communication, data translation and process interaction among the various connected applications.
This platform uses different processes, also known as business processes, to synchronize data between different databases and applications. These processes, in turn, use connectors to link different databases and applications.
DBSync presents you with a series of connectors. Among them is the Microsoft Dynamics CRM connector, which has two versions: online and on-premise.
These connectors allow you to move data in and out of Microsoft Dynamics 365 for Sales, in order to populate data warehouses, synchronize master databases, and streamline transaction details and online payments.

How to transfer from MS Dynamics 365 for Sales to a database

You will now see the steps necessary to transfer data from MS Dynamics 365 for Sales to MySQL. The same steps apply to other databases, such as MSSQL, Oracle and PostgreSQL.

Step 1: Create a new project.

Here you are presented with two options: importing an existing project or defining a new one.
To create a new project, press the Create New Project button. Then you will be prompted to enter a name for your project and save it.


Figure 1 – Create a new project

Step 2: Create a connector for each system that you intend to work with.

As we are connecting Microsoft Dynamics 365 for Sales to a database, you need to create two connectors, one for each resource: one of type Database Connector and the other of type MSCRM Online Connector.


Figure 2: Create a connector

Step 3: Create a process.

Once you are connected to the source and the destination, you need to create a process that defines the transfer task. To do so, press the Create New Process button and input a name. Just remember that process names cannot contain spaces.


Figure 3: Create a process

Step 4: Create a workflow.

Each process may have several workflows. To create a new workflow, press the Create New Workflow button. The system will ask for a workflow name; just remember that it cannot contain any spaces. Each workflow will contain a trigger and one or more rules.


Figure 4: Create a workflow

Step 5: Configure a trigger.

A trigger lets you define a query on your data source, in our case Microsoft Dynamics 365 for Sales. You are presented with two options: the query builder and the advanced query builder. These alternatives let you handle a query at different levels of sophistication, according to your level of skill.

Step 6: Create a rule.

You can have one or more rules for each workflow. Each rule defines the data destination and a mapping between source and destination.


Figure 5 – Trigger and Rule creation

Step 7: Define a map:

A map connects source and target fields. With DBSync, mapping is very easy, as you can connect the source-destination field pairs by dragging and dropping.



Figure 6 – Create a map

Step 8: Run the process.

You can do this manually or via the scheduler. Manual runs are useful for sporadic tasks (see figure 3). The scheduler lets you manage repeated runs, or run the process at a specific date and time.

What follows

In this article, you have learned how to connect Microsoft Dynamics 365 for Sales to a database. This is just one of the many uses of DBSync Cloud Workflow; there are many more for you to explore. Ready to try? Go to the DBSync website (http://www.mydbsync.com/) and sign up for a free trial.