The Reality of a Developer’s Life

I can’t say I am too proud of this post, since it is mostly GIFs that I found around the web and wanted to share with everyone.

In any case, I hope you will laugh a bit, and if you come across any other related GIFs, please let me know!

When you upload something to the production environment:

When you solve a problem without searching Google:

When you close your IDE without saving the code:

When you try to fix a bug at 3 AM:

When your regular expression returns what you expect:

When my boss told me that the module I have been working on will never be used:

When I show my boss that I have fixed a bug:

When I upload code without tests and it works as expected:

When marketing folks show developers what they have sold:

The first time you apply CSS to a web page:

When the sysadmin gives you root access:

When you run your script for the first time after several hours of working on it:

When you leave for the weekend and everyone else is at the office trying to fix all the issues:

When your boss finds someone to fix a critical bug:

When you receive extra pay because the project ended before the deadline:

When something that worked on Friday does not work on Monday:

When you develop without specifications:

When the boss tells me that ‘tests are for those who don’t know how to code’:

When I show the boss that I have finally fixed this bug:


When my project manager enters the office:


#BigData innovation through #CloudComputing:

Overview:

With the digitalization of almost everything in this world, the amount of data is increasing at an exponential rate. IT experts soon realized that analyzing this data is not possible with traditional data analysis tools. Considering this ever-expanding volume of useful data, which could be used in a number of ways, they came up with many solutions, among which two initiatives stand at the top: big data and cloud computing.

Big data analysis offers the promise of valuable insights into data that can create competitive advantage, spark new innovations, and drive increased revenues. By carefully analyzing the data, we can predict different things about a company. Cloud computing acts as a delivery model for a company’s IT services and has the potential to enhance business agility and productivity while enabling greater efficiencies and significantly reducing costs. By storing data on cloud servers instead of in an on-site IT department, you not only save money but also keep your data safe and secure, since the security of these cloud servers is usually in the hands of top IT security companies.

Both technologies continue to thrive. Organizations are now moving beyond questions of what and how to store big data and are addressing how to derive meaningful analytics that respond to real business needs. As cloud computing continues to mature, a growing number of enterprises are building efficient cloud environments, and cloud providers continue to expand their service offerings.

Characteristics and Categories:

Databases for big data:

One of the most crucial tasks any company faces is choosing the correct database for its big data. As data volumes grow, more and more companies have emerged to provide databases for this purpose. Databases designed to handle big data are usually referred to as NoSQL systems, and in contrast to traditional SQL-based systems they do not depend on SQL. The working principle behind all these offerings is, however, the same: provide efficient and effective storage and give companies ways to extract useful information from their big data, helping them build and expand their business through useful analytics. Among the best-known options are Cassandra and Amazon’s DynamoDB on AWS. These systems not only give you good storage options but also keep your data safe and secure and provide useful analytics about your data.
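To make this concrete, here is a minimal sketch of writing and reading a record in DynamoDB from Python with boto3; the table name, key schema, and region are hypothetical, and the table must already exist in your AWS account:

import boto3
from decimal import Decimal

# Connect to the (assumed) "events" table; region and table name are placeholders.
dynamodb = boto3.resource("dynamodb", region_name="us-east-1")
table = dynamodb.Table("events")

# Store one item; boto3 requires Decimal rather than float for DynamoDB numbers.
table.put_item(Item={"device_id": "sensor-42", "ts": 1700000000, "temp_c": Decimal("21.5")})

# Fetch it back by its (assumed) composite primary key: device_id + ts.
response = table.get_item(Key={"device_id": "sensor-42", "ts": 1700000000})
print(response.get("Item"))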

Machine Learning in the Cloud:

One of the most interesting features of cloud computing and big data analysis is machine learning and its integration with AI. Machine learning cloud services make it easier to build sophisticated, large-scale models that can really increase efficiency and enhance the overall management of your company’s data. By injecting AI into your business, you can learn truly amazing things from your data analytics.

IoT platforms:

Internet of Things or IoT is also an interesting aspect of big data and cloud computing. Big data and IoT are essentially two sides of the same coin. Big data is more about data whereas IoT is more concerned with the flow of this data and connectivity of different data generating devices. IoT has created a big data flux that must be analyzed in order to get useful analytics from it.

Computation Engines:

Big data is not just about collecting and storing a large amount of data; that data is of no use until it gives us useful information and analytics. This is the job of computation engines, which provide excellent scalability and use parallel and distributed algorithms to analyze the data. MapReduce is one of the best-known computation engines on the market at the moment.
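As a rough illustration of the MapReduce idea, here is a toy word count in Python: a map step emits (key, value) pairs, a shuffle groups them by key, and a reduce step aggregates each group. Real engines run this same pattern distributed across many machines:

from collections import defaultdict

documents = ["big data needs big engines", "data engines scale"]

# Map: emit a (word, 1) pair for every word in every document.
mapped = [(word, 1) for doc in documents for word in doc.split()]

# Shuffle: group the emitted values by key.
groups = defaultdict(list)
for word, count in mapped:
    groups[word].append(count)

# Reduce: aggregate each group (here, by summing the counts).
totals = {word: sum(counts) for word, counts in groups.items()}
print(totals)  # {'big': 2, 'data': 2, 'needs': 1, 'engines': 2, 'scale': 1}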

Big Data on AWS:

Amazon’s AWS provides one of the most complete big data platforms in the world, with a wide variety of options and services to help with your big data needs. With AWS you get fast, flexible IT solutions at a low cost. It can process and analyze any type of data regardless of volume, velocity, and variety. Best of all, AWS offers more than 50 services, with hundreds of features added to them every year, constantly increasing the efficacy of the system. Two of the most famous services offered by AWS are Redshift and Kinesis.

AWS Redshift:

Amazon Redshift is a fast, efficient, fully managed data warehouse that makes it simple and cost-effective to analyze all your data using standard SQL and your existing Business Intelligence (BI) tools. It lets you run complex analytic queries against petabytes of structured data, and thanks to sophisticated query optimization and high-performance local disks, most results come back in seconds. It is also extremely cost-efficient: you can start as small as $0.25 per hour with no commitments and gradually scale up to petabytes of data at $1,000 per terabyte per year.

The service also includes Redshift Spectrum, which allows you to run SQL queries directly against exabytes of unstructured big data in Amazon S3. You don’t need to load or transform the data, and you can use open data formats, including CSV, TSV, Parquet, Sequence, and RCFile. Best of all, Redshift Spectrum automatically scales query compute capacity based on the data being retrieved, so queries against Amazon S3 run fast and do not depend on data set size.
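Since Redshift speaks the PostgreSQL wire protocol, querying it from Python can be as simple as the following sketch with psycopg2; the cluster endpoint, credentials, and the "sales" table are placeholders for illustration:

import psycopg2

# Placeholder endpoint and credentials; Redshift listens on port 5439 by default.
conn = psycopg2.connect(
    host="examplecluster.abc123.us-east-1.redshift.amazonaws.com",
    port=5439,
    dbname="dev",
    user="awsuser",
    password="my-password",
)
with conn.cursor() as cur:
    # A standard SQL aggregate, exactly as a BI tool would issue it.
    cur.execute("SELECT region, SUM(amount) FROM sales GROUP BY region;")
    for region, total in cur.fetchall():
        print(region, total)
conn.close()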

AWS Kinesis:

Amazon Kinesis Analytics is another great service by Amazon and one of the easiest ways to process streaming data in real time with standard SQL, without having to learn new programming languages or processing frameworks. It lets you query streaming data or build entire streaming applications using SQL, so that you can gain actionable insights and respond promptly to your business and, more importantly, your customers’ needs.

Amazon Kinesis Analytics is a complete service that takes care of everything required to run your queries continuously, and it scales automatically to match the volume and throughput rate of your incoming data. You only pay for the resources your queries consume, which makes it extremely budget-friendly; there is no minimum fee or setup cost.
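For a sense of the producer side, here is a minimal sketch of pushing a record into a Kinesis stream from Python with boto3; a Kinesis Analytics SQL application could then query that stream continuously. The stream name and event fields are hypothetical:

import json
import boto3

kinesis = boto3.client("kinesis", region_name="us-east-1")

# One clickstream-style event; the "clickstream" stream must already exist.
event = {"user": "u-17", "page": "/pricing", "ts": 1700000000}
kinesis.put_record(
    StreamName="clickstream",
    Data=json.dumps(event),      # payload delivered to consumers
    PartitionKey=event["user"],  # determines which shard receives the record
)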


Scaling #Python on Heroku: Deployment, part 1

It’s always good for a developer to have a couple of different deployment options under their belt. Why not try deploying your site to Heroku, as well as PythonAnywhere?

Heroku is also free for small applications that don’t have too many visitors, but it’s a bit trickier to deploy.

We will be following this tutorial: https://devcenter.heroku.com/articles/getting-started-with-django, but we pasted it here so it’s easier for you.

The requirements.txt file

If you didn’t create one before, we need to create a requirements.txt file to tell Heroku what Python packages need to be installed on our server.

But first, Heroku needs us to install a few new packages. Go to your console with virtualenv activated and type this:

(myvenv) $ pip install dj-database-url gunicorn whitenoise

After the installation is finished, go to the apilama directory and run this command:

(myvenv) $ pip freeze > requirements.txt

This will create a file called requirements.txt with a list of your installed packages (i.e. Python libraries that you are using, for example Django :)).

Note: pip freeze outputs a list of all the Python libraries installed in your virtualenv, and the > takes the output of pip freeze and puts it into a file. Try running pip freeze without the > requirements.txt to see what happens!
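For reference, after installing the packages above, your requirements.txt might look roughly like this; the exact version numbers will differ on your machine:

Django==1.8.13
dj-database-url==0.4.1
gunicorn==19.6.0
whitenoise==3.2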

Open this file and add the following line at the bottom:

psycopg2==2.5.4

This line is needed for your application to work on Heroku.

Procfile

Another thing Heroku wants is a Procfile. This tells Heroku which commands to run in order to start our website. Open up your code editor, create a file called Procfile in the apilama directory, and add this line:

web: gunicorn mysite.wsgi

This line means that we’re going to be deploying a web application, and we’ll do that by running the command gunicorn mysite.wsgi (gunicorn is a program that’s like a more powerful version of Django’s runserver command).

Then save it. Done!

The runtime.txt file

We also need to tell Heroku which Python version we want to use. This is done by creating a runtime.txt in the apilama directory using your editor’s “new file” command, and putting the following text (and nothing else!) inside:

python-3.4.2

mysite/local_settings.py

Because it’s more restrictive than PythonAnywhere, Heroku wants to use different settings from the ones we use locally (on our computer). For example, Heroku wants to use Postgres while we use SQLite. That’s why we need to create a separate file for settings that will only be available to our local environment.

Go ahead and create the mysite/local_settings.py file. It should contain your DATABASES setup from your mysite/settings.py file, just like this:

import os
BASE_DIR = os.path.dirname(os.path.dirname(__file__))

DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.sqlite3',
        'NAME': os.path.join(BASE_DIR, 'db.sqlite3'),
    }
}

DEBUG = True

Then just save it! 🙂

mysite/settings.py

Another thing we need to do is modify our website’s settings.py file. Open mysite/settings.py in your editor and add the following lines at the end of the file:

import dj_database_url
DATABASES['default'] = dj_database_url.config()

SECURE_PROXY_SSL_HEADER = ('HTTP_X_FORWARDED_PROTO', 'https')

ALLOWED_HOSTS = ['*']

DEBUG = False

try:
    from .local_settings import *
except ImportError:
    pass

It’ll do the necessary configuration for Heroku, and it’ll also import all of your local settings if mysite/local_settings.py exists.

Then save the file.

mysite/wsgi.py

Open the mysite/wsgi.py file and add these lines at the end:

from whitenoise.django import DjangoWhiteNoise
application = DjangoWhiteNoise(application)

All right!

Heroku account

You need to install the Heroku toolbelt, which you can find here (you can skip the installation if you’ve already installed it during setup): https://toolbelt.heroku.com/

When running the Heroku toolbelt installation program on Windows, make sure to choose “Custom Installation” when asked which components to install. In the list of components that shows up after that, please also check the checkbox in front of “Git and SSH”.

On Windows, you must also run the following command to add Git and SSH to your command prompt’s PATH: setx PATH "%PATH%;C:\Program Files\Git\bin". Restart the command prompt program afterwards to enable the change.

After restarting your command prompt, don’t forget to go to your apilama folder again and activate your virtualenv! (Hint: Check the Django installation chapter)

Please also create a free Heroku account here: https://id.heroku.com/signup/www-home-top

Then authenticate your Heroku account on your computer by running this command:

$ heroku login

In case you don’t have an SSH key, this command will automatically create one. SSH keys are required to push code to Heroku.

Git commit

Heroku uses git for its deployments. Unlike PythonAnywhere, you can push to Heroku directly, without going via GitHub. But we need to tweak a couple of things first.

Open the file named .gitignore in your apilama directory and add local_settings.py to it. We want git to ignore local_settings, so it stays on our local computer and doesn’t end up on Heroku.

*.pyc
db.sqlite3
myvenv
__pycache__
local_settings.py

And we commit our changes:

$ git status
$ git add -A .
$ git commit -m "additional files and changes for Heroku"

Pick an application name

We’ll be making your blog available on the Web at [your blog's name].herokuapp.com, so we need to choose a name that nobody else has taken. This name doesn’t need to be related to the Django blog app or to mysite or anything we’ve created so far. The name can be anything you want, but Heroku is quite strict about which characters you can use: you’re only allowed simple lowercase letters (no capital letters or accents), numbers, and dashes (-).

Once you’ve thought of a name (maybe something with your name or nickname in it), run this command, replacing apilamablog with your own application name:

$ heroku create apilamablog

Note: Remember to replace apilamablog with the name of your application on Heroku.

If you can’t think of a name, you can instead run

$ heroku create

and Heroku will pick an unused name for you (probably something like enigmatic-cove-2527).

If you ever feel like changing the name of your Heroku application, you can do so at any time with this command (replace the-new-name with the new name you want to use):

$ heroku apps:rename the-new-name

Note: Remember that after you change your application’s name, you’ll need to visit [the-new-name].herokuapp.com to see your site.

Deploy to Heroku!

That was a lot of configuration and installing, right? But you only need to do that once! Now you can deploy!

When you ran heroku create, it automatically added the Heroku remote for our app to our repository. Now we can do a simple git push to deploy our application:

$ git push heroku master

Note: This will probably produce a lot of output the first time you run it, as Heroku compiles and installs psycopg. You’ll know it succeeded if you see something like https://yourapplicationname.herokuapp.com/ deployed to Heroku near the end of the output.

Visit your application

You’ve deployed your code to Heroku, and specified the process types in a Procfile (we chose a web process type earlier). We can now tell Heroku to start this web process.

To do that, run the following command:

$ heroku ps:scale web=1

This tells Heroku to run just one instance of our web process. Since our blog application is quite simple, we don’t need too much power and so it’s fine to run just one process. It’s possible to ask Heroku to run more processes (by the way, Heroku calls these processes “Dynos” so don’t be surprised if you see this term) but it will no longer be free.

We can now visit the app in our browser with heroku open.

$ heroku open

Note: you will see an error page! We’ll talk about that in a minute.

This will open a url like https://apilama.herokuapp.com/ in your browser, and at the moment you will probably see an error page.

The error you saw is because when we deployed to Heroku, we created a new database, and it’s empty. We need to run the migrate and createsuperuser commands, just like we did on PythonAnywhere. This time, they run via a special command line on our own computer, heroku run:

$ heroku run python manage.py migrate

$ heroku run python manage.py createsuperuser

The command prompt will ask you to choose a username and a password again. These will be your login details on your live website’s admin page.

Refresh it in your browser, and there you go! You now know how to deploy to two different hosting platforms. Pick your favourite 🙂

Artificial Intelligence Offerings as an #API

Artificial intelligence, or “AI” for short, is the use of intelligent machines that react and work like the human mind. This area of computer science is mainly concerned with speech recognition, processing, planning, learning, and problem-solving. On the surface, artificial intelligence may be linked to robotics, as it is mostly portrayed that way in sci-fi movies, but the concept is much more complex than that. Artificial intelligence is now capable of much more than you might think: it can reason, just like a human would; it can correct itself (self-correction); and it can learn and adapt. Most programs are fixed in the duties they perform, because their code binds them to do so; artificial intelligence differs from traditional methods in this department.

The use of artificial intelligence is very common, ranging from the top tech businesses to an average person just using a phone or laptop. The term originated in 1956, and today it holds greater meaning than ever, stretching from robotic processes such as automation to actual robotics itself! Artificial intelligence has all the abilities a technical machine should have, from speed to accuracy, all while being extremely human-like. AI can identify patterns and process data more efficiently than a human, making it essential for businesses that want to progress.

As we have concluded, AI is a broad term and is not limited to a concise definition. It reaches into one’s daily life: Siri, a virtual assistant, can perform a wide range of tasks, from looking up recipes to booking a flight. This type of artificial intelligence works through APIs to get the results you expect from it. API stands for Application Programming Interface, a channel between the user and a service provider. In the most basic terms, consider your virtual assistant Siri: on your command, it calls out through APIs to different services, such as ordering an Uber to your doorstep.

APIs and Applications of AI as an API:

As previously stated, API stands for Application Programming Interface: it provides a set of routines and tools for building software applications and specifies how different pieces of software can interact with one another. Cortana, an assistant made by Microsoft, lets you make reservations at a restaurant by calling such interfaces; this example highlights the use of a simple API and AI together.

Many other such interactions can be observed. The Google Maps API permits developers to embed Google Maps in a web page using either JavaScript or a Flash interface. Using Siri to locate a road for you ties the whole process together: Siri is a form of artificial intelligence speaking to an application programming interface. Even Tesla’s self-driving cars use Google Maps as a basic platform for their self-driving capabilities.


Use of Artificial Intelligence as an API in Businesses:

Many firms have switched from their old, traditional IT-based operating methods to superior artificial intelligence systems.
Many tedious everyday tasks are now performed by AI-based systems, freeing up human resources to invest their time in projects that are more beneficial for the company. Many Customer Relationship Management systems now use machine learning algorithms to discover how to communicate better with customers: on calling, the customer is immediately connected to an AI-based operator that deals with the customer’s concerns more efficiently than any human operator would.

ChatBots enable two-way communication by using artificial intelligence to engage the customer in conversation: such pop-ups ask customers what they are concerned with and display only the information relevant to them. ChatBots ensure better two-way communication and help promote consumer loyalty. Companies rely on artificial intelligence to handle such matters more efficiently and professionally than their human counterparts.


The Dawn of Artificial Intelligence; from IBM to AWS:

International Business Machines, or IBM: IBM is a platform that previously provided hardware but now also deals in software, including cognitive computing, a branch very similar to artificial intelligence. Its research dates back to the 1950s; today IBM provides server hardware, storage, software, cloud, and many cognitive offerings.

IBM Watson is the ultimate offering of IBM for AI and big data, with tons of applications. It was introduced several years ago and has since become one of the most powerful enterprise APIs out there.

Amazon Web Services Artificial Intelligence, or AWS AI: Amazon Web Services provides you with instances to optimize your applications, which you can upgrade or tune to enhance performance. AWS enhances the performance of your stack and offers a variety of services targeted at enterprise AI usage. It is also worth mentioning that Amazon maintains a blog about AI.

Intrigued? Create your own Artificial Intelligence based Program with APIs:

Artificial intelligence surrounds us, from Tesla’s self-driving cars to Siri on your iPhone; it comes into play even when we operate a system to get the smallest amount of output. Cortana, Siri, Tesla, Cogito, and our favorite platform for movies and series, Netflix, are all examples of artificial intelligence, and it’s easy to be influenced by them.

In this technological era nothing seems impossible: you can create your own form of artificial intelligence by using an Application Programming Interface to build your own custom software.

Using API.ai: API.ai is a service that lets you transform speech into text and process natural language, with an artificial intelligence system that will cater to your every need.

Step 1: Log in to their site and allow the program to access the basic data of your account. Accept their terms and conditions and begin creating your own artificial-intelligence-based virtual assistant.

Step 2: After authorizing access to basic information, customize your AI assistant by adding some standard information: its name, a description (what you intend your agent to be), the language it will operate in, and the time zone.

Step 3: The test console allows you to try out the basic operations performed by your agent: you can enter queries and see how your agent responds to them. Enabling the additional small talk module is up to your preference; you can do so by clicking the enable button.

Step 4: Save the changes you have made and find your assistant’s API.ai API keys. Feel free to make additional changes if you please, then use JavaScript (or any language that can make HTTP requests) to connect to api.ai.

Step 5: Use HTML5 speech recognition to get on the right track, communicate with api.ai, and host your web interface. Last but not least, say “hello” to your state-of-the-art, artificial-intelligence-powered virtual assistant!
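If you’d rather test the agent from a script than from a browser, the sketch below queries it over API.ai’s REST interface from Python. The v1 /query endpoint, version parameter, and response layout reflect API.ai’s documented v1 API at the time of writing, but treat them as assumptions and check the current docs; the token is the client access key from Step 4:

import requests

CLIENT_ACCESS_TOKEN = "your-client-access-token"  # from your agent's API keys page

# Assumed v1 /query endpoint; "v" is the protocol version date.
response = requests.post(
    "https://api.api.ai/v1/query",
    params={"v": "20150910"},
    headers={"Authorization": "Bearer " + CLIENT_ACCESS_TOKEN},
    json={"query": "hello", "lang": "en", "sessionId": "demo-session-1"},
)

# The agent's reply text typically lives under result.fulfillment.speech.
print(response.json())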

Making HTTP Requests in JavaScript

I really liked this detailed guide to consuming APIs through JavaScript. No matter whether you are using ReactJS or Angular or any other frontend framework, this is basic knowledge that you will have to deal with at some point. Rahul leads by example here. A must-read, in my opinion.

CAP'n Tech

The Introduction:

As you probably know very well by now, the internet is made up of a bunch of interconnected computers called servers. When you are surfing the web and navigating between web pages, your browser requests information from these servers. The chart below explains the request explicitly.

[Chart: the browser’s request/response cycle]

That is, your browser sends a request, waits for the server to respond, and (once the server responds) processes the response. All of this is governed by protocols, or rules, which are a topic for another day.

Application Program Interface (API) 

Now, the Wikipedia definition of an API will tell you that an Application Programming Interface (API) is a set of subroutine definitions, protocols, and tools for building application software. But in layman’s terms, in the context of the web, APIs generally allow you to send commands to programs running on the servers that…

View original post 640 more words

APIs for Authentication: A journey

Application Program Interface (API) key authentication is a technique that overcomes the hurdles of shared credentials by using a unique key for each user. The key is usually a long series of letters and numbers, distinct from the account’s login password. The owner provides the client with the key, and the client uses it to access a website: when the client presents the API key, the server grants access to data. The server can still limit administrative functions for any client, for example changing passwords or deleting accounts. API keys are sometimes used so that account passwords do not have to be sent again and again; they offer the flexibility to limit control while also protecting user passwords.

API keys work in a lot of different ways, since they were conceived by multiple companies that each settled on its own way of doing authentication. Some schemes, like Basic Auth, follow an established standard with strict rules. Over time, though, a few familiar approaches have emerged: putting the key in the Authorization header alongside a username and password, simply adding the key onto the URL, or burying the key in the request body together with the data. Wherever the key is placed, the outcome is the same: the server grants the user access.

There are several security protocols in use, such as OAuth 1.0a, Basic API authentication with TLS, and OAuth 2.0. Basic Auth is the simplest, because it only uses the standard framework or language library. Being the simplest, it also offers the least security and provides no advanced options: you are simply providing a username and password that is Base64 encoded.
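As a quick sketch of those placements, here is how each looks with Python’s requests library against a hypothetical https://api.example.com service; the endpoint, parameter names, and key are all illustrative:

import requests

API_KEY = "my-secret-key"

# 1. Key in the Authorization header.
requests.get("https://api.example.com/data",
             headers={"Authorization": "Bearer " + API_KEY})

# 2. Key added onto the URL as a query parameter.
requests.get("https://api.example.com/data", params={"api_key": API_KEY})

# 3. Key buried in the request body together with the data.
requests.post("https://api.example.com/data",
              json={"api_key": API_KEY, "report": "monthly"})

# Basic Auth, for comparison: requests Base64-encodes username:password for you.
requests.get("https://api.example.com/data", auth=("username", "password"))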

OAuth 1.0a, on the other hand, is the most secure of these protocols: it uses a cryptographic signature combined with a token secret, a nonce, and other request-based information. Since the token secret never passes directly across the wire, there is no possibility of anyone seeing a password in transit, which gives OAuth 1.0a its edge. That level of security comes with a lot of complexity, though: you have to apply hashing algorithms with strict steps. By now, however, this problem has been overcome, as nearly every programming language can do it for you.

Repose is another API authentication platform; it provides open-source API validation, HTTP request logging, rate limiting, and much more, through a RESTful middleware platform that is easily scalable and extensible. OAuth 2.0 and Auth0 are both open-source API authenticators, and both take a completely different approach from OAuth 1.0a: the encryption is handled by TLS (previously called SSL) rather than by cryptographic signatures. There are not that many OAuth 2.0 libraries, which is a disadvantage for users; still, OAuth 2.0 is used by big names like Google and Twitter.

Auth0 is a platform for authenticating apps that supports just about all identity providers on any device or cloud. It uses a secure HTTPS API to integrate with other tools, giving a seamless experience, and it lets clients authenticate with credentials they are comfortable with.

Many API management platforms are available, each bringing something unique to the table. Kong is an API manager that offers a range of plugins to improve security, authentication, and the management of inbound and outbound traffic. Kong acts as a gateway between the client and the API, providing layers of rate limiting, logging, and authentication.

3Scale is another manager; it separates the traffic control and management layers, which produces superior, unsurpassed scalability. It integrates many gateway deployments with Amazon, Heroku, and Red Hat OpenShift, which are free to use. Additionally, plugins can be added through libraries built in several different languages, and 3Scale designs custom API management protocols for organizations as well. Microsoft Azure also provides a host of options so that little effort is required on the client’s part and most of the work is handled by the manager. Azure uses a professional front end and developer portal that make it more user-friendly, and it offers the greatest number of options for APIs, which attracts more clients.

Dell Boomi can be thought of as cloud middleware: plumbing between applications that reside in the cloud or on premises. It can efficiently manage data for social networks and other uses, and it communicates with data across different or common domains, giving it an added advantage. MuleSoft is another API manager; it makes use of the Anypoint platform to re-architect SOA infrastructure covering legacy systems, proprietary platforms, and custom integrations. The result is a strong, agile business solution for its clients.

AWS Cognito is another management system, offered by Amazon Web Services. It has an adaptive multi-layer design that includes products ensuring availability and resilience, and it is built with security as its key feature. It can easily be deployed on any platform, using a ready-made library or a custom implementation chosen from more than 50 integrations. It enables clients to authorize users through an external identity provider that assigns temporary security credentials for access to your website or app, and it supports external identity providers that speak OpenID and SAML, with the option to integrate your own identity provider.

Recently, APIs have found applications in health-related fields. A vast majority of healthcare providers and other companies in the healthcare industry are making use of web and mobile services, providing vital information to patients and helping them share information with their prescribers. Medical APIs will also help with integration between partner providers, patient support services, insurance companies, and government agencies. But whether these APIs are HIPAA compliant is a question many users have. Yes: many providers meet the challenge of conforming to client demands while also ensuring the security of medical data.

Apigee Edge, another platform, enhances the digital value chain from the back end for customers who engage through an app. It is HIPAA (Health Insurance Portability and Accountability Act) and PCI compliant. Apigee maintains compliance through a number of features, including encrypting and masking information, protecting traffic, and managing and securing all data.

For healthcare providers, there are other API managers that provide HIPAA compliance, like TrueVault. TrueVault acts as an interface between internal data and external applications. For instance, if a diagnostic laboratory wants to provide online viewing of test results, TrueVault lets it give approved third parties access to that information without custom APIs or hooks. Hence, it provides a secure service that not only saves time but also delivers information to patients via mobile and tablet interfaces.

Still, API managers face many challenges in building optimized solutions for the healthcare sector. Lack of access to effective tools for testing and monitoring these interfaces is a serious obstacle for developers. Furthermore, developers lack insight and feedback on medical APIs, which is a critical factor in developing the elaborate, engaging APIs that would be widely adopted by the medical field.


Related Links:

  1. Apigee management compliance

https://apigee.com/about/cp/api-management-compliance

  2. MuleSoft API manager

https://www.mulesoft.com/

  3. TrueVault Systems

https://www.truevault.com/healthcare-api.html

  4. Microsoft Azure

https://azure.microsoft.com/en-us/resources/videos/azure-api-management-overview/

  5. Dell Boomi

https://marketing.boomi.com/API-Management-Demo-Success.html

  6. Kong API manager

https://getkong.org/

https://getkong.org/plugins/oauth2-authentication/

  7. 3Scale management

https://www.3scale.net/technical-overview/

  8. Akana API management solutions

https://www.akana.com/solutions/api-management

  9. Auth0

https://auth0.com/opensource

  10. Repose API manager

http://www.openrepose.org/

  11. OAuth2.0

https://oauth.net/2/

  12. OAuth1.0a

https://oauth.net/core/1.0a/

Mega Cities of Today and Tomorrow

I have been talking a little bit about Smart Cities since the very beginning of this blog.
Smart Cities for me are just a huge playground for APIs, Cloud Computing and more or less all the technologies that we utilize and promote nowadays.
Today there are so many enormous mega-cities, and the urgency for “smart solutions” grows bigger and bigger.
In the following visualizations, it is clearly depicted how those needs can be handled and why such needs exist in the first place.

BLACK BOX PARADOX

A bit of analytics on mega cities of today:


People will continue moving into cities in the future:

View original post 20 more words

Scrape a Webpage using Python 2.7

I have been drafting a similar article for quite some time, but then my friend Konstantinos posted this. I just loved it.
The simplicity, the straight points he makes, and obviously the hands-on tutorial. I just hope that you will enjoy it as much as I did!

My Data Mining

Similar to the previous post, in this post, we are going to learn how to extract information from the Internet. We have to create a dataset first, to implement data mining techniques. So, let’s start.

Github Code of this project.

1. What is scraping?

Scraping is a technique that allows us to extract information from the Internet. For example, scraping a web page means that we are going to extract the HTML from that page and then take the ‘useful’ information from the HTML. Useful information is the information that we need, for example, the infobox of a Wikipedia page or the meta tags of a web page, etc. For more information, you can check the definition of Web Scraping.
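As a minimal sketch of that idea, the snippet below fetches a page’s HTML and pulls out its meta tags; it assumes the requests and beautifulsoup4 packages, which may differ from the package list in the original post:

import requests
from bs4 import BeautifulSoup

# Download the raw HTML of the page.
html = requests.get("https://en.wikipedia.org/wiki/Web_scraping").text
soup = BeautifulSoup(html, "html.parser")

# Keep only the 'useful' part: the meta tags and their content.
for tag in soup.find_all("meta"):
    name = tag.get("name") or tag.get("property")
    if name:
        print(name, "->", tag.get("content"))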

2. Scraping a Webpage

For this project we are going to need the following packages:

Like before, I am going to build the project as…

View original post 823 more words

Build Your Own Udemy

Today we are all living in a technology-driven world where online learning has become an important and totally worthwhile way of learning on the go. The future of higher education now lies in the hands of online learning systems. Nowadays, college and university students find themselves burdened with jobs and family commitments, and having the option to study in their own time has become critically important: it is convenient, it is less expensive for most students, and you can work on a course just about anywhere you have computer access.

Because of the expanding trend of online learning platforms like Udemy and Khan Academy, the question arises: how can we make our own online learning platform, and what are the core technologies involved in developing such a system? The answer is Application Programming Interfaces (APIs). APIs are sets of instructions or requirements that govern how one application can communicate with another.

The function of an API is usually fairly straightforward and simple. Choosing which type of API to build, understanding why that type is appropriate for your application, and then designing it to work effectively are the keys to giving your API a long life and making sure developers actually use it.

There are many types of APIs available. For example, you may have heard of Java APIs: interfaces within different classes that let objects interact with each other in the Java programming language. Along with program-centric APIs, we also have web APIs like the Simple Object Access Protocol (SOAP), Remote Procedure Call (RPC), and the most popular, at least in name, Representational State Transfer (REST); a small sketch of the REST style follows.
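To make the REST style concrete, here is how a hypothetical e-learning API might be consumed from Python: resources live at URLs, and the HTTP verb expresses the intent. The base URL and routes are invented for illustration:

import requests

BASE = "https://api.example-elearning.com"  # hypothetical service

requests.get(BASE + "/courses")                           # list all courses
requests.get(BASE + "/courses/42")                        # fetch one course
requests.post(BASE + "/courses", json={"title": "APIs"})  # create a course
requests.delete(BASE + "/courses/42")                     # delete a course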

There is more than one alternative

If you’re looking to build your own e-learning platform like Udemy, it’s important to decide how you will deliver the lectures for the courses you offer: audio, video, or simple text. Video lectures are most in demand these days, so it’s important to know how to make your own live streaming videos for course lectures. There are a lot of APIs that can power an application that is user friendly and fast, but for live video streaming specifically, Castasy is a good choice: a cost-efficient solution that arrives in the form of software. This new live streaming software comes in compatible versions for both iOS and Android devices, and also in a desktop version. It gives the user an application and a website that can stream live videos, and the user can allow or deny access to any follower. Each video gets a separate URL, and by pasting that URL into a browser, users can view the video on their desktops through the website version of the software; with different URLs, users can view any number of videos available there. The software also includes a chat feature that lets viewers chat on videos as they are streamed and discuss relevant topics. That is a very good feature for e-academies, as it helps students work through their queries in chat.

Now, when we talk about the most popular, well-known, and very efficient API providers, Citrix, GoToWebinar, and Udemy usually come to mind, so let’s look at them one by one and in detail.

What Citrix basically does is stream applications from a centralized location into an isolated environment, where they are executed on different target devices. Application configuration, settings, and relevant files are copied to the client device. When you start session virtualization, applications are delivered from hosting servers in the data center with the help of application streaming. The user is connected to the server to which that specific application was delivered; the application is executed on the server, and the server’s power is maximized. While the server receives the mouse clicks and keyboard strokes, it sends all the screen updates back to the end-user device.

GoToWebinar is purpose-built for do-it-yourself webinars, making it easy for multinational organizations to deliver their message to thousands of people at the same time, eliminating costly travel and expensive marketing promotions. Innovative in-session features and reports help businesses evaluate the success of their webinars and judge whether they worked. It is actually a Citrix product, but it’s usually considered a separate API.

If we look at Udemy as an API, we see that, depending on our intended use case, we may be interested in creating our own courses, basically our own platform for e-learning, and Udemy helps us develop that stage. We can consume premium courses or develop our own through Udemy; it’s an easy way to provide services online and earn a little bit of a fortune.

API comparison: pricing, benefits, and availability

GoToWebinar
Pricing: Starter costs $89.00/mo and serves up to 100 participants; Pro costs $199.00/mo and entertains up to 500 participants; Plus costs $429.00/mo and serves up to 2,000 participants.
Benefits: reliable; easy to use; cost efficient; saves the time and money otherwise consumed by marketing.
Availability: easily available in the US and outside of the US.

Citrix
Pricing: ranges between $80 and $500.
Benefits: standardized, common setup; compresses the data; encrypts the data for security; faster performance; centralized management.
Availability: easily available all around the globe.

Udemy
Pricing: list prices range between $20 and $200; larger discounts are offered; promotions can run different courses at $10 to $15.
Benefits: the ability to create your own courses; the easiest opportunity to centralize your training materials; easy management of users and courses.
Availability: available all around the world.


It is not as hard as you may think

Every one of these API technologies has a lot of benefits, and most are available all around the globe. If we want to build our own e-learning platform, it’s easier to utilize these APIs than to develop our own: they are cost efficient and give us all the desirable features, whether that’s online streaming of lectures or publicity for a seminar. They provide every feature necessary to develop our own Udemy.