Build Your Own Udemy

We live in a technology-driven world where online learning has become an important and worthwhile way to learn on the go. Much of the future of higher education now lies in online learning. College and university students often find themselves burdened with jobs and family commitments, so the option of studying on their own schedule has become critically important: it is convenient, it is less expensive for most students, and coursework can be done just about anywhere there is computer access.

Given the expanding popularity of online learning platforms like Udemy and Khan Academy, the question arises: how can we make our own online learning platform, and what core technologies are involved in developing such systems? A large part of the answer is application programming interfaces (APIs). APIs are sets of instructions or requirements that govern how one application communicates with another.

The function of an API is usually fairly straightforward. Choosing which type of API to build, understanding why that type is appropriate for your application, and then designing it to work effectively are the keys to giving your API a long life and making sure developers actually use it.

There are many types of APIs. For example, you may have heard of Java APIs: interfaces within classes that let objects interact with each other in the Java programming language. Along with such program-centric APIs, there are also Web APIs, such as the Simple Object Access Protocol (SOAP), Remote Procedure Call (RPC), and the most popular, at least in name, Representational State Transfer (REST).

There is more than one alternative

If you are looking to build your own e-learning platform like Udemy, it is important to decide how you will deliver the lectures for the courses you offer: audio, video, or simple text. Video lectures are the most popular these days, so it is worth knowing how to produce your own live-streamed course lectures. Plenty of APIs can help you build an application that is user-friendly and fast, but for live video streaming specifically, Castasy is a good option: a cost-efficient solution that has arrived in the form of software. It ships in compatible versions for both iOS and Android devices and also comes in a desktop version. The software gives the user an application and a website that can stream live videos. The user can grant or deny access to any follower. Each video gets a separate URL, and by opening that URL in a browser, viewers can watch the video on their desktops through the website version of the software; with different URLs, users can view any number of videos available there. The software also includes a chat feature that lets viewers discuss a video as it streams. This is a very good feature for e-academies, since it helps students talk through their questions in real time.

When we talk about the most popular, well-known, and efficient API providers, Citrix, GoToWebinar, and Udemy usually come to mind. Let us look at them one by one, in detail.

With Citrix, applications are streamed from a centralized location into an isolated environment, where they are executed on the target devices. Application configuration, settings, and relevant files are copied to the client device. With session virtualization, applications are delivered from hosting servers in the data center via application streaming, and the user connects to the server to which the application was delivered. The application executes on the server, making full use of the server's power: the server receives the user's mouse clicks and keystrokes and sends screen updates back to the end-user device.

GoToWebinar is purpose-built for do-it-yourself webinars, making it easy for multinational organizations to deliver their message to thousands of people at the same time while eliminating costly travel and expensive marketing promotions. Innovative in-session features and reports help businesses evaluate whether a webinar was successful. It is actually a Citrix product, but it is usually treated as a separate API.

If we look at Udemy as an API, then depending on our intended use case we may be interested in creating our own courses, and essentially our own e-learning platform; Udemy helps at exactly that stage. We can consume premium courses, or develop and publish our own through Udemy. It is an easy way to provide services online and earn some money.
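As a quick illustration, here is a hedged Python sketch of listing courses through Udemy's public API. The endpoint and Basic-auth scheme follow Udemy's affiliate API documentation as I understand it, and the credentials are placeholders you would obtain from Udemy; verify both against the current docs.

```python
import base64
import requests

# Placeholder credentials obtained from Udemy's API portal (assumption).
CLIENT_ID = "your-client-id"
CLIENT_SECRET = "your-client-secret"

def list_courses(search="python", page_size=5):
    """Fetch a page of public course listings from Udemy's API 2.0.

    Endpoint and auth scheme are based on Udemy's public affiliate API
    docs; check the current documentation before relying on them.
    """
    token = base64.b64encode(f"{CLIENT_ID}:{CLIENT_SECRET}".encode()).decode()
    resp = requests.get(
        "https://www.udemy.com/api-2.0/courses/",
        params={"search": search, "page_size": page_size},
        headers={"Authorization": f"Basic {token}"},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()["results"]

if __name__ == "__main__":
    for course in list_courses():
        print(course["title"], "-", course.get("price"))
```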

Here is how the three compare on pricing, benefits, and availability:

GoToWebinar

Pricing: the Starter plan costs $89.00 for up to 100 participants, Pro costs $199.00/mo for up to 500 participants, and Plus costs $429.00 for up to 2,000 participants.

Benefits:

  • Reliable
  • Easy to use
  • Cost-efficient
  • Saves time and money otherwise spent on marketing

Availability: easily available in the US and internationally.

Citrix

Pricing: ranges between $80 and $500.

Benefits:

  • Standardized, common setup
  • Compresses data
  • Encrypts data for security
  • Faster performance
  • Centralized management

Availability: easily available all around the globe.

Udemy

Pricing: list prices of Udemy courses range between $20 and $200; larger discounts are offered, and promotions can bring courses down to $10-15.

Benefits:

  • The ability to create your own courses
  • The easiest way to centralize your training materials
  • Easy management of users and courses

Availability: available all around the world.

It is not as hard as you may think

Each of these API technologies has many benefits, and most are available around the globe. If we want to build our own e-learning platform, it is easier and more cost-efficient to use these APIs than to develop everything ourselves. Whether we need online streaming of lectures or publicity for a seminar, they provide every feature necessary to build our own Udemy.

Cloud Computing is every #Startup #CTO's best friend

The needs of a startup:

Chief technology officers play a major role in managing the technical aspects of a company, especially at startups. The requirements of a company in its early stages differ considerably from its requirements later on. For most startups, the initial period is turbulent: the market waters are harsh, and finding loyal partnerships is cumbersome. For CTOs, this period can be exceedingly stressful, as they have to ensure the entire operation of the company runs smoothly at every point. As the world advances into digital zones, the burden on CTOs has increased. Initially, the company may hire many IT professionals to take care of its technology needs; as time goes on, that staff is cut down while some advance and take on more responsibility. The later stages of a startup are more secure and stable; by this point CTOs already have their strategy in motion, they have hired professionals to handle the technology work, and their major role lies in supervision. During the middle period, however, CTOs face numerous challenges: finding the right balance in the company, managing resources, storing data, and keeping the company wired, operational, and connected to the market can all be hurdles. Diligent CTOs manage the company's needs while keeping their eye on the prize.

The role of a CTO

Chief technology officers are required to maintain the smooth functioning of technology while reducing the company's expenses. Micro-level events are exceptionally useful to CTOs, and they are always on the lookout for changes at this level, for example, ways in which digital technology can be improved. Since data is the basic tool of most companies, CTOs often look for ways to improve data throughput. The technology market and all its innovations stay on the CTO's radar. These people do not invest impulsively; rather, they make calculated decisions to ensure that every investment results in incremental growth and savings for the company. CTOs watch market trends and environments, the evolutions that take place, and the competition they face in the market. Moreover, they pay diligent attention to customer preferences and buying habits; these two aspects show the company how to market products so they become more appealing to customers. Customer needs are evaluated roughly every five years, as preferences change only slightly during that time frame. However, if certain technological advances make big waves in the market, CTOs must change their strategies accordingly. While these are the basic requirements and credentials of CTOs, hiring equally qualified tech experts also falls under their domain. CTOs must also manage their team and ensure that every department's technology needs are fulfilled and run smoothly at all times.



What is Cloud Computing?

Cloud computing, or internet-based computing, is on-demand access to a pool of configurable computing resources. These resources can include computer networks, data, storage, servers, applications, and other services. The services can be provisioned with minimal management and are normally safer and more reliable for data needs. Cloud storage and computing give customers and companies a platform to store their data safely, privately, and even remotely. In some cases outsourced companies provide the services; in others, cloud deployments are highly personalized.

Cloud computing and services can really reduce the cost of a company's technology infrastructure. For startups, costs are already high and initial revenue low, so cloud computing provides an easy, accessible, and cheap option, as they do not need to buy separate servers. By taking care of an organization's IT needs, it gives companies the leverage to focus on central issues and core business goals. Moreover, it allows CTOs to manage technology needs faster, more professionally, and in a systematic manner; when such professionals have to take care of big data and services on a daily basis, they rarely find time for the more important issue at hand, managing the technology resources. Since the servers are outsourced, maintenance costs are negligible for the company. In addition, it reduces the company's personnel needs, and hence cuts costs considerably.

While cloud computing can offer a range of benefits to companies, there are some drawbacks as well. Public cloud computing options carry real risk, and in the past there have been countless breaches that resulted in the loss of personal information from companies. This information can include sensitive credit card details, employee records, or any company data. Hackers often release such information on social media outlets, which can put a company's public image in jeopardy. There have been numerous documented cases of theft and cyber hacking on public clouds; it is less common in private cloud computing. Nonetheless, the associated risks are high, and due to the remote nature of the service, the criminal can be very hard to track down.

Cloud Computing for CTOs: Design solutions in Cloud

Cloud computing can offer a lot to companies, and especially to CTOs. Not only does such a service bring many cost-saving benefits, but most technology aspects of the company are assisted by it. Cloud computing solutions are cheaper for companies, and by outsourcing data and IT needs, CTOs can focus on what truly matters: designing solutions to run the company seamlessly. Data becomes much easier for officers to manage, becomes more transparent, and storage issues rarely arise.

Amazon’s CTO, Werner Vogels, has already spoken about the benefits he has reaped from cloud computing in his company. Vogels advocated the services at a conference, stating, “the cloud has nothing to do with technology, the cloud is defined by all its benefits”.

While apps and gadgets can take care of data storage needs, for companies and startups the cost of downtime can be great; by investing in cloud services, this downtime can be prevented. According to Vogels, if cloud services lower their costs and tackle privacy issues, companies will advance at a remarkable rate.


Cultural Heritage #APIs: What are they, why we need them

Introduction

Currently, it is of great significance for cultural heritage scholars to focus on application programming interfaces (APIs). In very simple terms, APIs are code libraries assembled by companies offering a web service, with the major objective of enabling third-party applications to link up and communicate with the web service platform[1]. Cultural heritage, in turn, refers to the expression of ways of living developed by a community and passed from one generation to the next, comprising practices, customs, values, places, and artistic expression; cultural heritage APIs make that heritage reachable through technology.

Usually an API is invisible to the human eye, even though APIs are the interfaces that facilitate computer-to-computer communication. Within the domain of cultural heritage there is incredible potential to create tools that can help revolutionize both the presentation of collections and the way people experience and interact with cultural heritage. Cultural heritage APIs are of great significance, which is exactly why we need them. For example, it is through cultural heritage APIs that we can discover Europe's rich cultural heritage, found in the museums, libraries, galleries, culturally oriented institutions, and archives across the whole continent. In addition, digitization efforts enable people from different parts of the world to get to know this heritage through online platforms[2]. In short, cultural heritage APIs help preserve important cultural materials.

Museum REST APIs

In application programming interfaces, REST is an architectural style for designing networked applications in which simple HTTP calls are made between machines. There are different categories of museum APIs, all designed around REST so they can serve cultural heritage needs; the categories include arts, education, and location. In the arts category, museum APIs aim to integrate a museum's entire collection into an application, allowing users to access data about objects, people, exhibitions, galleries, and publications. In the education category, museum APIs are designed as a RESTful interface to a particular museum's collections, so that searched items are returned from the database already paginated, in either JSON or XML format.
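As a hedged illustration of that pattern, the Python sketch below walks a paginated JSON collection. The endpoint and the field names (`objects`, `next`) are hypothetical stand-ins, since each museum's API names these differently.

```python
import requests

# Hypothetical museum collections endpoint; real museum APIs differ in
# parameter names, but the pagination pattern is the same.
BASE_URL = "https://api.example-museum.org/collection/objects"

def iter_objects(query):
    """Yield every object matching `query`, following JSON pagination."""
    url, params = BASE_URL, {"q": query}
    while url:
        page = requests.get(url, params=params, timeout=10).json()
        yield from page["objects"]
        url, params = page.get("next"), None  # `next` is a full URL or absent

for obj in iter_objects("bronze"):
    print(obj["id"], obj.get("title"))
```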

Availability of Museums REST APIs

Looking at the availability of museum REST APIs, a useful reference point is the Gerrit code review system, which ships with a REST-like API available over HTTP. An API of this kind is well suited for automated tools to build upon and also supports a number of ad-hoc scripting use cases. It is through publishing API protocols and documentation of this kind that web service providers enable people to access the cultural heritage data they want.
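As a concrete illustration of that style, here is a minimal Python sketch of calling a Gerrit server's REST API. The host is a placeholder; the `)]}'` security prefix that must be stripped from Gerrit's JSON responses is part of Gerrit's documented behaviour.

```python
import json
import requests

GERRIT = "https://gerrit.example.org"  # placeholder Gerrit host

def gerrit_get(path, **params):
    """Call a Gerrit REST endpoint and strip the `)]}'` XSSI prefix
    that Gerrit prepends to every JSON response."""
    resp = requests.get(f"{GERRIT}{path}", params=params, timeout=10)
    resp.raise_for_status()
    body = resp.text
    if body.startswith(")]}'"):
        body = body.split("\n", 1)[1]
    return json.loads(body)

# List a few open changes, the way an automated tool might.
for change in gerrit_get("/changes/", q="status:open", n=5):
    print(change["_number"], change["subject"])
```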

The Necessity of Museum APIs

There are a number of benefits linked to museum APIs with regard to cultural heritage. To start with, the Artsy API provides access to images of historic artwork, along with related information, for educational and other non-commercial purposes[3]. Other APIs provide access to positions, luminosity, color, and similar data on exoplanets and constellations to people anywhere in the world. The Manuscripts Online API allows an individual to search a diverse body of online primary resources relating to written and early printed culture in a particular place, for example British culture in the period 1000-1500. The Connected Histories API brings together a range of digital resources on early modern and nineteenth-century Britain in a single federated search that enables sophisticated searching of names, places, and dates, and lets users connect and share resources within a private workspace. Connected Histories allows users to query the search engine programmatically using GET parameters, with results retrieved in XML format[4].
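The sketch below illustrates that GET-parameter/XML pattern in Python. The URL, parameter names, and XML element names are illustrative assumptions, not the documented Connected Histories interface; consult its API documentation for the real ones.

```python
import requests
import xml.etree.ElementTree as ET

# Illustrative stand-in for a federated-search endpoint of this kind.
SEARCH_URL = "https://www.connectedhistories.org/api/search"

params = {"keyword": "London", "dateFrom": "1800", "dateTo": "1850"}
resp = requests.get(SEARCH_URL, params=params, timeout=10)
resp.raise_for_status()

root = ET.fromstring(resp.text)      # results come back as XML
for result in root.iter("result"):   # element names are assumptions too
    print(result.findtext("title"), result.findtext("source"))
```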

Future Thoughts on Museum REST APIs

Future thoughts on museum APIs involve determining where open cultural data in museums should go next[5]. Museums have recently been joining the global open data movement by opening up their databases, sharing their images, and releasing their knowledge. Another open question is how to use open data to engage more people, and more diverse people, with, for example, United Kingdom heritage and culture.

Cloud Technologies Used by Museums

Cloud technology, or cloud computing, can be seen as a natural progression of utility computing. Early computers required many users to share a single console, an arrangement that ended when the advent of personal computing brought convenience into our homes[6]. The Internet has completely changed the way people link up to information and to each other. Museums provide access to their collections and programs by hosting websites and applications for public use.

Museums mainly use software as a service (SaaS) and platform as a service (PaaS) as their types of cloud computing[7]. PaaS systems enable access to virtual hardware on which the most appropriate software toolsets can be run. SaaS systems require users to work through a particular interface, with the major objective of hosting applications[8].

By providing API-only access, a SaaS implementation can offer features for developers to build upon, with scalability the most significant. Letting the cloud take control of where data is stored and where applications run through a SaaS system removes a significant amount of workload and complexity from the developer. Museums consider PaaS because of the ease of moving applications from one service provider to another. Museums have switched to cloud computing because of these operational advantages and because of the environmental benefits of using services in the cloud[9].

The data exposed by the cloud technologies museums use is, in turn, consumed by other museums worldwide, helping them advance their own work with the shared objective of preserving cultural heritage in different parts of the world.

REST APIs are significant and useful because they allow users to access the data portals of different natural history museums and retrieve collection and research datasets for use in software or applications. At the Natural History Museum in London, for instance, datasets are returned in JSON; the portal holds over 2 million specimen records from the museum's zoology, botany, and entomology collections.
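Here is a hedged sketch of pulling specimen records from a data portal of this kind. The London portal has exposed a CKAN-style action API, so the snippet assumes CKAN's `datastore_search` action, with the dataset's resource ID left as a placeholder you would look up on the portal itself.

```python
import requests

# CKAN-style action API; the host matches the NHM data portal, but treat
# the exact resource_id as a placeholder to find on the portal.
API = "https://data.nhm.ac.uk/api/3/action/datastore_search"
SPECIMENS_RESOURCE_ID = "<resource-id-of-the-specimen-dataset>"

resp = requests.get(
    API,
    params={"resource_id": SPECIMENS_RESOURCE_ID, "q": "Lepidoptera", "limit": 5},
    timeout=10,
)
resp.raise_for_status()
for record in resp.json()["result"]["records"]:
    # Field names vary per dataset; these two are common in specimen data.
    print(record.get("scientificName"), record.get("collectionCode"))
```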


Bibliography

Armbrust, Michael, Armando Fox, Rean Griffith, Anthony D. Joseph, Randy H. Katz, Andrew Konwinski, Gunho Lee, David A. Patterson, Ariel Rabkin, Ion Stoica, and Matei Zaharia. Above the Clouds: A Berkeley View of Cloud Computing. Berkeley, CA: Electrical Engineering and Computer Sciences University of California at Berkeley, 2009. Accessed September 6, 2016. https://www2.eecs.berkeley.edu/Pubs/TechRpts/2009/EECS-2009-28.pdf

Fernando, Niroshinie, Seng W. Loke, and Wenny Rahayu. “Mobile cloud computing: A survey.” Future Generation Computer Systems 29, no. 1 (2013): 84-106.

Isaksen, Leif. Pandora’s Box: the Future of Cultural Heritage on the World Wide Web. (2009): 110-130. Accessed September 6, 2016. https://leifuss.files.wordpress.com/2008/08/pandorasbox.pdf

Johnson, Larry, Samantha Adams Becker, Victoria Estrada, and Alex Freeman. The NMC Horizon Report: 2015 Museum Edition. Austin, TX: The New Media Consortium, (2015): 190-205.

Rosenthal, Sara, Alan Ritter, Preslav Nakov, and Veselin Stoyanov. “Semeval-2014 Task 9: Sentiment Analysis in Twitter.” In Proceedings of the 8th International Workshop on Semantic Evaluation, 73-80. Dublin, Ireland: SemEval, 2014.

Sucan, Ioan A., Mark Moll, and Lydia E. Kavraki. “The Open Motion Planning Library.” IEEE Robotics & Automation Magazine 19, no. 4 (2012): 72-82.


[1]Michael Armbrust, Armando Fox, Rean Griffith, Anthony D. Joseph, Randy H. Katz, Andrew Konwinski, Gunho Lee, David A. Patterson, Ariel Rabkin, Ion Stoica, and Matei Zaharia. Above the Clouds: A Berkeley View of Cloud Computing. (Berkeley, CA: Electrical Engineering and Computer Sciences University of California at Berkeley, 2009), 530.

[2] Leif Isaksen. Pandora’s Box: the Future of Cultural Heritage on the World Wide Web. (2009), 120.

[3] Sara Rosenthal, Alan Ritter, Preslav Nakov, and Veselin Stoyanov. “Semeval-2014 Task 9: Sentiment Analysis in Twitter.” In Proceedings of the 8th International Workshop on Semantic Evaluation (Dublin, Ireland: SemEval, 2014), 73-80.

[4] Ioan A. Sucan, Mark Moll, and Lydia E. Kavraki. “The Open Motion Planning Library.” IEEE Robotics & Automation Magazine 19, no. 4 (2012): 72-82.

[5] Larry Johnson, Samantha Adams Becker, Victoria Estrada, and Alex Freeman. The NMC Horizon Report: 2015 Museum Edition. (Austin, TX: The New Media Consortium, 2015), 192.

[6] Ibid., 200.

[7] Niroshinie Fernando, Seng W. Loke, and Wenny Rahayu. “Mobile cloud computing: A survey.” Future Generation Computer Systems 29, no. 1 (2013), 90.

[8] Ibid., 100.

[9] Ibid., 102.

Exploring Cache options for Hypermedia #APIs

An application programming interface (API) is a set of protocols used for building software applications. It directs how the components of software interact with one another, and APIs are very useful because they make it easier to develop a program and put all the units together. More recently, hypermedia APIs have become the latest hype, scaling better and promoting decoupling and encapsulation, along with the myriad advantages those properties bring.

REST (Representational State Transfer) is a popular architectural approach to communications that is currently used in the development of web software. REST's lighter-weight communication between producer and consumer makes it a well-liked style for cloud-based APIs, including those provided by Google, Microsoft, and others. Running over HTTP (the Hypertext Transfer Protocol), this architecture has several components, one of the most important being cache control. REST APIs operate in a simple manner: you pick a base protocol like HTTP, model your resources with URIs, and then map actions onto HTTP methods. Hypermedia augments this procedure with the simultaneous presentation of information and controls; it is a pivotal constraint of the web and an extension of REST.
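As a minimal sketch of that mapping, assuming a hypothetical `/courses` resource, the four CRUD actions land on four HTTP methods:

```python
import requests

BASE = "https://api.example.com"  # hypothetical service, for illustration only

# Create: POST to the collection URI; by convention the new resource's
# URI comes back in the Location header.
created = requests.post(f"{BASE}/courses", json={"title": "REST 101"}, timeout=10)
course_url = created.headers["Location"]

requests.get(course_url, timeout=10)                                       # read
requests.put(course_url, json={"title": "REST 101, 2nd ed."}, timeout=10)  # update
requests.delete(course_url, timeout=10)                                    # delete
```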

So what is Hypermedia control?

As the digital world grows in complexity, more and more people want the different components of a system, all within a single application, to interact with one another easily and clearly. This cross-linguistic data exchange is what APIs have always evolved to do; however, the languages used to code individual software applications do not always correspond nicely with each other.

Hypermedia can circumvent this problem and keep applications interconnected and communicating easily. Hypermedia APIs are a more evolved way of creating APIs, built on universally used and understood hyperlinks.

Hypermedia functionality embeds links into associated resources to describe the actions they support. This allows end users to carry out the actions prescribed in the resource.
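One common convention for such embedded links is HAL-style `_links`. The sketch below, against a hypothetical orders API, shows a client following a server-provided link by its relation name instead of hard-coding URLs; the presence or absence of the link itself tells the client which actions are currently allowed.

```python
import requests

BASE = "https://api.example.com"  # hypothetical orders service

# A HAL-style representation embeds the available actions as links, e.g.
# {"id": 42, "status": "open",
#  "_links": {"self":   {"href": "/orders/42"},
#             "cancel": {"href": "/orders/42/cancel"}}}
order = requests.get(f"{BASE}/orders/42", timeout=10).json()

# Follow the server-provided link by relation name rather than building
# the URL ourselves; a missing link means the action is not available.
cancel = order.get("_links", {}).get("cancel")
if cancel:
    requests.post(BASE + cancel["href"], timeout=10)
```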

Current situation of web caches:

Because of the dynamic nature of web applications, a lot of data must be stored, and with every new HTTP request, caching is needed. With full-page HTTP caching, you cache the complete output of the requested page, bypassing your application entirely on subsequent requests. While this is the current state of web caching, it is not always possible for highly dynamic sites, which can apply the power of HTTP caching only to fragments of their pages.
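As a minimal sketch of full-page caching, here is a hypothetical endpoint (using Flask purely for illustration) that marks its whole response as cacheable, so that an upstream HTTP cache can answer repeat requests without invoking the application at all.

```python
from flask import Flask, make_response

app = Flask(__name__)  # illustrative app; any web framework would do

@app.route("/catalogue")
def catalogue():
    resp = make_response("<html>...full rendered page...</html>")
    # Mark the whole response as cacheable for five minutes, so a cache
    # in front (browser, CDN, Varnish) can serve repeat requests
    # without touching the application.
    resp.headers["Cache-Control"] = "public, max-age=300"
    return resp

if __name__ == "__main__":
    app.run()
```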

Caching with Hypermedia APIs:

One of the major issues with APIs and web-based architectures is that optimum performance is achieved when the network is not in use, since the network is the slowest part of any application. Hypermedia APIs, however, allow greater caching of data near the client in two ways: through a content delivery network or through a proxy in front of the server. This reduces dependency on the origin server infrastructure and hence increases speed considerably.

In addition, remote procedure call (RPC)-style interactions are possible with hypermedia APIs, where dynamic elements can be edge-cached. On popular websites this allows trending products or articles, along with their comments, to be updated frequently, making for a more interactive site.

One of the keys to a successful hypermedia API is its use of caching, such as local caches and ETags. In large systems, cache invalidation can prove even more costly than not caching at all; however, through careful design of hypermedia resources, you can enable your system to rely on the natural caching behaviors built into HTTP.
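Here is a minimal sketch of the ETag mechanism against a hypothetical resource: the second request revalidates rather than re-downloads, so an unchanged resource costs only a 304 status on the wire.

```python
import requests

URL = "https://api.example.com/reports/42"  # hypothetical resource

first = requests.get(URL, timeout=10)
etag = first.headers.get("ETag")  # opaque validator chosen by the server

# Revalidate instead of re-downloading: if the representation is
# unchanged, the server answers 304 Not Modified with an empty body.
headers = {"If-None-Match": etag} if etag else {}
second = requests.get(URL, headers=headers, timeout=10)

if second.status_code == 304:
    body = first.text   # our cached copy is still fresh
else:
    body = second.text  # resource changed; adopt the new representation
print(len(body), "bytes")
```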

Other benefits of Hypermedia APIs:

There are other benefits of hypermedia APIs as well. They enable applications to browse an API from the root or from any resource, working the way the web works with hyperlinks. This requires API federation, where media APIs are interlinked with a content API.

Moreover, hypermedia allows for more flexible and evolvable systems: URIs remain persistent and unchanged, and all configuration is done at the API root. If under any circumstance the API is faulty, the URL infrastructure suffers no damage, which proves very cost-effective.

In addition, hypermedia APIs are extensible. More links and forms can be added without fear of breaking the API; clients that do not support them simply ignore them. Hence, validation and guidance of the client are ensured.

Hypermedia APIs are simple and scale from small to large single-page apps. They also enable standardization in how API calls are made and how information is returned. The various links and forms act as affordances in the hypermedia infrastructure and can be followed after validation on the client side. While this advantage is still in the research phase, it can prove very beneficial for certain businesses.

Another subtle advantage of hypermedia APIs over traditional APIs without hyperlinks is that in the latter you can only expose the information the client has requested, without caring what it is used for. In a hypermedia API, you are conscious of the workflow and can actively guide the client by providing links. This creates superb communication between client and server developers.

Limitations of hypermedia APIs:

Despite the obvious benefits, hypermedia APIs still have certain drawbacks; however, these are just barriers to be climbed on the way to adopting this wonderful technology.

One of the most critical arguments against hypermedia APIs is that the payload of serialized JSON is higher than that of compact formats such as Thrift objects. Overall, however, if API affordances are used fully, less data needs to be transported, so measures like calls per session may not increase by much.

Future of Hypermedia APIs:

As systems become even more intricate, a universal language among diverse systems will be needed. Hypermedia, despite its limitations, can be the panacea for such communication gaps, especially where Big Data and caching are concerned. Hypermedia is powerful, and its full potential has yet to be discovered. It may even someday leave today's REST-over-HTTP methods behind, relying solely on its ability to link unorganized, distributed data through a common protocol. Whether it completes this task, only time will tell.

How #music-streaming site Orfium used Varnish Cache to improve page performance

This is a hosted blog post I wrote recently for section.io.

Background: Implementing Varnish Cache with section.io.


Orfium is a new music platform which combines some of the best features of existing platforms such as SoundCloud, Bandcamp and Beatport, allowing users to share their tracks, promote them and sell downloads. In addition, it promises to introduce a range of interesting new options for music makers and labels, including the payment of streaming royalties and the ability to upload DJ mixes while compensating all artists whose music is included.

Uniquely, Orfium takes an artist-driven approach to music. The platform could prove to be a strong rival to existing sites such as SoundCloud, which have come under fire in recent months for their heavy-handed approach to handling copyright infringements and their apparent focus on securing lucrative licensing deals rather than listening to the voices of their users.

Why we started using section.io to set up Varnish Cache


Orfium is a music platform that enables people to browse and listen to music. Users can sign up and log in for a personalised experience on our platform. However, we also enable users to search for music and listen to samples or popular playlists without logging in. All public information is also optimised and available for search engines. Both of these characteristics mean that a huge amount of our traffic comes from non-authenticated sessions, which Orfium welcomes. We also have a large amount of traffic that comes from social media and blog references.

Brainstorming at the office, we realised we could definitely optimise the performance of non-authenticated sessions, especially during specific traffic spikes. From the very beginning, Varnish Cache seemed to meet our list of requirements:

  • Cache the whole HTTP response
  • Have the ability to customise the rules of caching
  • It is a tested and scalable open source solution

The very first thing we did was to create an EC2 Varnish instance to verify that it worked against our staging server. It passed all the tests, including the load tests, which were the most important factor for us in adopting any specific technology. We were impressed with how it performed without exploding our staging servers.
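This was not our actual test suite, but a rough sketch of the kind of smoke test involved: request the same URL twice and inspect the `Age` and `X-Varnish` headers that Varnish attaches, where a non-zero `Age` (and a second transaction ID in `X-Varnish`) indicates the response came from cache. The staging URL below is a placeholder.

```python
import requests

URL = "https://staging.example.com/discover"  # placeholder staging URL

def probe(url):
    resp = requests.get(url, timeout=10)
    # Varnish stamps responses with X-Varnish and Age headers: an Age
    # above 0, and two transaction IDs in X-Varnish, mean a cache hit.
    return resp.headers.get("X-Varnish"), resp.headers.get("Age")

print("first request: ", probe(URL))   # expect a miss (Age: 0)
print("second request:", probe(URL))   # expect a hit on a cacheable page
```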

It is in our mentality as a company, “if someone already does something (well) and we need this, we pay for it”. After evaluating several Varnish solutions as a service, section.io seemed easy to configure and immediately passed all of our tests when we added our own Varnish configuration.


How section.io helped the Orfium platform


After a couple of weeks in the staging environment with section.io, we decided to release our Varnish Cache configuration to the production environment. The configuration was super simple, and it took only a couple of minutes to set up the new DNS settings and then update the Varnish settings.

The most impressive thing was that from Day One of this release we could see that our backend analytics “relaxed” immediately, since most of the non-authenticated traffic was routed directly through section.io and Varnish. Day after day, everything seemed to work even better, and this enabled us to start working on how we can cache chunks of information that are shared amongst users, even though every user has their complete personalised Orfium experience.


section.io’s CDN support


First things first. Every time I am about to choose a new platform for production, I value customer support more than anything. I don't ever want to have a production issue on a Sunday morning with nobody there to resolve it. Thus, I always check again and again that the support is there for us, both for simple/stupid questions and for the more technical stuff.

I have to say, this is about the best support I have ever received. The guys even checked my Varnish configuration to evaluate it. Every other problem we had was resolved in less than half an hour.

Next Steps for optimizing our CDN configuration


After evaluating Varnish technology with section.io in a production environment, we feel confident that we can dive deeper into using more aspects of section.io as a platform, find new ways to decrease http responses and offer an even better experience to our users.

One important lesson that we learned from section.io is that having quality customer support is hugely beneficial to a technology company, not only because people are going to trust your service over another, but also because people will actually adopt a new technology more easily when they have someone to support them. Obviously this is not a new lesson in computer science, as such practices have been there for decades, but this is a valuable lesson for cloud service offerings that are often not all that open and helpful. Even though good customer support takes effort, it is important when dealing with web performance and security, and we were pleased that the section.io team was always able to answer our questions.

#Data as a Service: REST #APIs Transforming the Cloud era

History of Cloud Computing

Cloud computing is a kind of Internet-based computing that offers pooled computer processing resources and data to computers and other devices on demand. Often referred to as “the cloud,” it is the delivery of on-demand computing resources, everything from applications to data centers, over the Internet, normally on a pay-for-use basis (Armbrust et al., 2010). The foundation of cloud computing is the 1950s notion of “time sharing”: back then, mainframe computers were huge, occupying plenty of room, and because of the cost of purchasing and sustaining mainframes, organizations could not afford to buy one for each user. The solution was “time sharing,” in which multiple users shared access to data and CPU time. In 1969, J.C.R. Licklider helped establish the ARPANET (Advanced Research Projects Agency Network); his idea was global interconnection and access to programs and data at any site, from any place (Mohamed, 2009). This network became the basis of the internet.

In the 1970s, IBM launched an operating system known as VM which permitted admins to run multiple virtual systems, or “virtual machines,” on a single physical node (Mohamed, 2009). The VM operating system took “time sharing” to the next level, and most of the primary functions of virtualization software can be traced back to it. In the 1990s, telecommunications companies began offering virtualized private network connections (Mohamed, 2009), which allowed more users onto the same physical infrastructure through shared access. The change let traffic shift as necessary, enabling better network balance and more control over bandwidth usage. In the interim, PC-based virtualization began in earnest, and as the internet became more accessible, online virtualization naturally fell into place. Cloud computing as a term came along around 1997. In 2002, Amazon created Amazon Web Services (AWS), providing a cutting-edge suite of cloud services from storage to computation (Mohamed, 2009). Amazon also introduced the Elastic Compute Cloud (EC2) as a commercial web service that allowed companies to rent computers on which to run their applications. In 2009, Google and Microsoft joined in: Google App Engine brought low-cost computing and storage services, and Microsoft followed suit with Windows Azure (Mohamed, 2009). Even field service management software has since made its passage to the cloud.


History of REST APIs

To understand the history of REST APIs, the history of APIs comes first. Modern web APIs were legitimately born with Roy Fielding's dissertation, Architectural Styles and the Design of Network-based Software Architectures, in 2000 (Lane, 2012). Web APIs first appeared in the wild with the launch of Salesforce in February of that year. Salesforce was enterprise-class, web-based sales force automation, an “Internet as a service,” and XML APIs were part of Salesforce.com from day one. In November of the same year, eBay launched the eBay Application Program Interface (API) along with the eBay Developers Program (Lane, 2012). Amazon then started Amazon.com Web Services, which allowed developers to incorporate Amazon.com content and features into their websites and enabled third-party sites to search and display products from Amazon.com in XML format. Amazon E-Commerce kicked off the modern web API movement (Lane, 2012).

Web APIs gained traction when things got social. In 2004, Flickr launched its API; Flickr was later acquired by Yahoo (Lane, 2012). The launch of its RESTful API made Flickr the imaging platform of choice for the early blogging and social media movement, as users could easily embed their Flickr photos into their blogs and social network streams. Facebook then launched Version 1.0 of the Facebook Development Platform, which let developers access Facebook friends, photos, events, and profile info (Lane, 2012). Twitter followed suit with the Twitter API, and Google launched the Google Maps API. As these APIs were making a social splash across the internet, Amazon recognized the potential of RESTful APIs and launched a new web service, Amazon S3 (Buyya, 2008). It delivered a simple interface for storing and retrieving any amount of data at any time from anywhere on the internet, offering developers access to the same highly scalable, reliable, fast, and cheap data storage infrastructure that Amazon uses to run its own global network of websites.

Necessity of REST APIs

REST is a set of principles that describe how web standards like HTTP and URLs are supposed to be used. Its purpose is to promote performance, scalability, simplicity, portability, visibility, modifiability, and reliability. It is a series of guidelines and architectural styles used for data transmission, commonly applied to web applications. RESTful application programming interfaces (APIs) are APIs that follow the REST architecture (Lozano, Galindo, & García, 2008). REST became necessary and important for minimizing coupling between client and server components in a distributed application. When a server is going to be used by many clients that the developer has no control over, REST helps manage those clients; it also matters when the server must be updated regularly without forcing updates to the clients' software. REST is everywhere; it is the part of the web that makes it work so well.

Recent Advancements in REST APIs

The REST APIs for Atlassian applications are among the recent advancements: Atlassian applications expose REST APIs that developers use to access services of the Atlassian platform (Yates et al., 2014). These REST APIs provide an alternative to the Java APIs used by in-process plugins, and they offer better change tolerance than in-process APIs. WordPress likewise embraced a JSON REST API as part of the platform's future (WordPress, 2011). There is a separation between client and server, and requests can be read or executed without being inside either the WordPress front end or the admin panel (WordPress, 2011). The integration of the JSON REST API marks the evolution of WordPress from its humble beginnings as a blogging solution into a fully featured application platform; JSON is a lightweight data interchange format based on a subset of the JavaScript language. The WP API allows one to take CRUD (Create, Read, Update, and Delete) actions on many kinds of WordPress content: posts, pages, media, comments, and so on. The REST API gives other languages instant access to the complete range of WordPress's native functionality, and it allows mobile developers to treat WordPress installs like any other server. Use of the WordPress front end becomes strictly optional (Katayama, Nakao, & Takagi, 2010). Additionally, batch requests allow making requests of multiple different endpoints of the REST API in one HTTP request.
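As a brief illustration, here is a minimal Python sketch against the WP API's `wp/v2` endpoints. The site domain and credentials are placeholders, and writing requires an authentication method (such as an application password) to already be configured on the site.

```python
import requests

SITE = "https://example.com"  # placeholder WordPress site with the REST API enabled

# Read: list the latest published posts (no authentication required).
posts = requests.get(f"{SITE}/wp-json/wp/v2/posts", params={"per_page": 3}, timeout=10)
for post in posts.json():
    print(post["id"], post["title"]["rendered"])

# Create: writing needs credentials, e.g. an application password
# configured on the site (user name and password here are placeholders).
requests.post(
    f"{SITE}/wp-json/wp/v2/posts",
    json={"title": "Hello REST", "content": "Posted via the WP API", "status": "draft"},
    auth=("editor", "application-password"),
    timeout=10,
)
```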


Open source projects are furthering software practices based on RESTful APIs. SmartBear Software launched an open source project under the governance of the Linux Foundation called the Open API Initiative (OAI) to establish standards and guidance for how REST APIs are described (Katayama et al., 2010). The major goal of the OAI is to define a standard, language-agnostic interface to REST APIs that enables both computers and humans to discover and comprehend the capabilities of a service without access to source code or documentation, and without network traffic inspection.

Future of REST APIs

RESTful APIs are regarded as a natural fit for service-oriented architectures and cloud computing, and microservices are working to make RESTful API design the rule in the future. Daniel Bachhuber sees the REST API going even further; he says:

“I believe WordPress to be the embodiment of core philosophies, more than a particular manifestation of software: ownership over personal data, design for users, commitment to backward compatibility, and so on. The WP REST API is the foundational component of WordPress being embedded within 100% of the web” (Schiola, 2016).

IoT developers need REST without needless bloat, in both HTTP and JSON. For JSON-based IoT the future is austere, and the REST model is a strong fit for IoT. REST holds the future, since it allows organizations to build infrastructure with fewer worries about long-term hitching to a particular client-side stack; the server will always live longer than the client (Lanthaler & Gütl, 2012). Another key idea in this REST architectural philosophy is that the server supports caching and is stateless.


References

Armbrust, M., Fox, A., Griffith, R., Joseph, A. D., Katz, R., Konwinski, A., … & Zaharia, M. (2010). A view of cloud computing. Communications of the ACM, 53(4), 50-58.

Buyya, R., Yeo, C. S., & Venugopal, S. (2008, September). Market-oriented cloud computing: Vision, hype, and reality for delivering it services as computing utilities. Proceedings of 10th IEEE International Conference High Performance Computing and Communications, (pp. 5-13). Los Alamitos, CA: IEEE CS Press.

Katayama, T., Nakao, M., & Takagi, T. (2010). TogoWS: integrated SOAP and REST APIs for interoperable bioinformatics Web services. Nucleic Acids Research, 38(suppl 2), W706-W711.

Lane, K. (2012). History of APIs. API Evangelist. Retrieved from: http://apievangelist.com/2012/12/20/history-of-apis/

Lanthaler, M., & Gütl, C. (2012, April). On using JSON-LD to create evolvable RESTful services. Proceedings of the Third International Workshop on RESTful Design (pp. 25-32). Rio de Janeiro, Brazil: ACM Press.

Lozano, D., Galindo, L. A., & García, L. (2008, September). WIMS 2.0: Converging IMS and Web 2.0. designing REST APIs for the exposure of session-based IMS capabilities. In The Second International Conference on Next Generation Mobile Applications, Services, and Technologies, (pp. 18-24).

Mohamed, A. (2009). A history of cloud computing. Computer Weekly, 27. Retrieved from: http://www.computerweekly.com/feature/A-history-of-cloud-computing.

Schiola, E. (2016, January 20). The future of REST API: An interview with Daniel Bachhuber. Torque. Retrieved from: http://torquemag.io/2016/01/future-rest-api-interview-daniel-bachhuber/

WordPress. (2011). WordPress.org. Retrieved from: https://wordpress.org/

Yates, A., Beal, K., Keenan, S., McLaren, W., Pignatelli, M., Ritchie, G. R., … & Flicek, P. (2014). The Ensembl REST API: ensembl data for any language. Bioinformatics, 613.

A different Wikipedia for #Web #APIs

A couple of months ago I read an article by Kin Lane about efforts to create shareable API docs. Personally, I am a strong fan of creating your API definitions upfront, because it gives you an estimate of what integrating with your service will look like for your clients, and it also helps your team develop clients and tests. If you are interested in those matters, you can always have a look at a presentation I gave a couple of years ago.

This article focuses on one of the efforts mentioned by Kin. It is called APIs.guru, and it is an open source effort to document public APIs in the OpenAPI (formerly known as Swagger) format. Starting from there, with existing tooling like the awesome API Transformer, you can generate RAML, WADL, API Blueprint, and whatever else they support or plan to. Another interesting alternative is its open-source analog api-spec-converter (once it supports more output formats).


The Project

It was not long before I starred the GitHub project, and Ivan (the mastermind behind the whole thing) contacted me and we started chatting about it.

What exactly is APIs.guru?

The overall goal of the project is to create a machine-readable Wikipedia for REST APIs with the following principles:

  • Open source, community driven project.
  • Only publicly available APIs (free or paid).
  • Anyone can add or change an API, not only API owners.
  • All data can be accessed through a REST API (see the sketch below).
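As a small illustration of that last point, the sketch below pulls the project's machine-readable directory. The URL and response fields follow APIs.guru's public docs as of this writing, so verify them before relying on this.

```python
import requests

# APIs.guru exposes its whole directory as a single JSON document.
LIST_URL = "https://api.apis.guru/v2/list.json"

directory = requests.get(LIST_URL, timeout=10).json()
print(f"{len(directory)} APIs in the directory")

# Each entry maps a provider to its known versions of OpenAPI definitions.
for name, info in list(directory.items())[:5]:
    preferred = info["preferred"]
    swagger_url = info["versions"][preferred]["swaggerUrl"]
    print(name, "->", swagger_url)
```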


Building Around APIs.guru

Listed below are some of the core integrations.

APIs.guru is also used as a test suite in the following projects:

  • swagger-parser – Swagger 2.0 parser and validator for Node and browsers
  • SwaggerProvider – F# Type Provider for Swagger
  • ReDoc – Swagger-generated API Reference Documentation

Spreading the Word

Ivan has made it clear that he needs any help anyone is willing to offer, be it

  • coding,
  • documentation,
  • dissemination
  • or just some feedback.

When you try to create a community project and build something around it, you should be open to talking to people and listening to what they say. Ivan has exactly the right mindset to run something like this. It did not take long before I invited him to present at the API Athens Meetup.

It was a nice event with tons and tons of API discussions, and pizza as well!!


Join the Movement


Become an API Guru!!

#APIs, #SmartCities, #IoT and more on @APIlama

Today I wanted to do some research on the topics I like to post about most. Don't get too alarmed: this is not an academic essay about the pros and cons of technology and the like. I just “googled” the hottest topics in tech, and the results can be found below.

To summarize a little, those topics are: APIs, the Web (WWW), programming in general, Smart Cities, and IoT.

Mainly I wanted to see how those topics are discussed around the Internet, and whether my gut is right in chasing those areas.

It is pretty obvious from the graphic below that APIs, the Web, and programming in general are pretty popular topics, while Smart Cities and IoT are only now starting to pick up. That is more or less what I was expecting to see. I was a little surprised by last year's peak in WWW; that is something I did not see coming, and I am not totally sure why it happened.

I was also expecting to see more traffic on Smart Cities and IoT, but maybe this means that Google needs more time to understand a new topic that people are searching for and discussing.

[Figure: Google Trends, interest over time for APIs, Web, Programming, Smart Cities, and IoT]

For this research I used Google Trends. This is a really nice tool to provide an overview of various topics across time and regions. You can find this specific research directly on Google Trends.

Regional Interest

Another area of interest, apart from how those topics perform over time, is how they perform across regions. Some of the results were surprising.

For example, I did not expect so many people from Sri Lanka or Tunisia to be interested in APIs, mainly because I don't know any developers over there 😛

Some of the other results were more or less expected. For example, it was no surprise that South Korea is so interested in IoT, or that Germany searches a lot for World Wide Web, which is somewhat expected since they are trying to become the Silicon Valley of Europe.

What amazed me was all the “googling” about Smart Cities coming from Malta. I expected this from the United Arab Emirates, Singapore, and India, which are considered pioneers in the area, but Malta was a surprise to me. With a little bit of searching I found out that Malta is deep in the Smart City game with their SmartCity project, which, by the way, looks awesome!

Top Interests and Queries

Another interesting bit of information that Trends shows is the top topics for the overall category you are searching, along with the most prominent queries.

I am only attaching the information about APIs, because the rest was not that interesting: the other categories are either not well populated yet or not that relevant to a technical site.

[Figure: Google Trends, top topics and queries related to APIs]

Looking at the results, it totally makes sense that REST is there, and also JSON and Javascript. But Java???


I am not a big fan of Java, but it is definitely not a term I would attach to APIs today. On the other hand, Java was the first to coin the term, but since then we've gone a loooong way.

In any case, it is definitely a good thing that people are looking for better programming paradigms, and that is what APIs are all about.

APIs: better programming, better software quality

Create your Own Trend Graphics

This article is part of my research on the topics I find “hot”. I would like to hear what you think about those trends, and to see you play with your own graphics.

Feel free to contact me with your ideas, and I would love to share them in a future post!! As I have said many times in the past, this site is a collaborative effort, and my posts are only a trigger for further discussion.


What is a “Smart City”?

It’s a fair question, but a hard one to answer.

Many larger municipalities have embraced the “smart city” concept in recent years, but definitions of the term — and examples of the ways technology is being used to make cities “smart” — run the gamut. Mayors and city CIOs usually talk about using sensors to, say, wirelessly manage streetlights and traffic signals to lower energy costs, and they can provide specific returns on investment for such initiatives — x millions of dollars saved over y amount of time, for example.

Other examples include using sensors to monitor water mains for leaks (and thereby reduce repair costs), or to monitor air quality for high pollution levels (which would yield information that would help people with asthma plan their days). Police can use video sensors to manage crowds or spot crimes. Or sensors might determine that a parking lot is full, and then trigger variable-message street signs to direct drivers to other lots.

Smart cities as places for fun

Those are some of the countless practical examples. But smart cities can also be fun. In Bristol, England, a custom-built infrared sensor system was added to street lamps for a few weeks in late 2014 to record the shadows of pedestrians walking by. The shadows were then projected back through the streetlights for others walking by later to see.

Called “Shadowing” and developed by Jonathan Chomko and Matthew Rosier, the initiative was intended as a public art installation. A winner of a Playable City Award, “Shadowing” helps illustrate how broad and elusive the definition of “smart city” has become.

That’s a good thing.

“A smart city shouldn’t just save money, but should also be attractive and fun to live in,” said Carl Piva, vice president of strategic programs at TM Forum, a global nonprofit association with 950 member organizations whose aim is to guide research into digital business transformation, including smart city initiatives.

“Being a smart city is more than being efficient and involves turning it around to make it fun,” Piva said.
In Kansas City, officials are working with Cisco to install various sensors, including controls from Sensity Systems, for new LED streetlights to improve operating efficiency. Other smart city sensors could be added later.

The Bristol “Shadowing” project was discussed at a recent forum in Yinchuan, China, attended by politicians and technology experts from around the world, Piva said. It was introduced by Paul Wilson, managing director of Bristol Is Open, a joint venture of the Bristol City Council and the University of Bristol that’s devoted to creating an “open, programmable city region” made possible by fast telecom networks and the latest software and hardware.

“Many smart city projects don’t have immediate ROI attached,” Piva said. “My personal reflection is that technology of the future will become more and more invisible to individuals, and the best success criteria will be people not really even noticing the technology. For the time being, that means seeing a lot of technology trying to talk to us or engage with us in various ways. Every city mayor and everybody running for election is now invested in making his city smart. You sort of need to attract businesses and want to attract individuals with talent and make it a prosperous place, to make it livable and workable.”

Piva said he has noticed that some cities want to focus on building technology communities, which seems to be a significant part of what Kansas City, Mo., is doing with an innovation corridor coming to an area with a new 2.2-mile streetcar line.

Other cities, especially in Brazil, are using technology to focus on fostering tourism, Piva said. “The common element of smart cities is the citizen and the need to have citizens involved and feel at home,” he explained.

Over and over, city officials talk about the smart city as needing to provide “citizen engagement.”

China’s focus on smart cities

China, which has multiple cities with more than 10 million residents each, has pushed forward with a variety of smart technologies, some that might rankle Americans because of the potential privacy risks they raise.

Piva said there are nearly 300 pilot smart city projects going on in a group of municipalities in the middle of the vast nation. “If you jump on a bus, you may encounter facial recognition, which will be used to determine whether you have a bus permit,” he said.

The city of Yinchuan has reduced the size of its permitting work force from 600 employees to 50 by using a common online process accessible to citizens who need anything from a house-building permit to a driver’s license, Piva said.

While Yinchuan’s payback on new permitting technology is easy to determine, “a lot of these ROIs are really hard to calculate,” Piva admitted.

In stark contrast to Yinchuan’s smart city initiative, with its concrete monetary ROI, is an effort in Dubai. Officials in that United Arab Emirates city are building a “happiness meter,” which will collect digital input from ordinary citizens on their reactions to various aspects of city life. It could be used to evaluate the cleanliness of streets, the effectiveness of security checkpoints and an assortment of other measures in combination. In some cities, citizen input on happiness may come from smartphones, but it could also come from digital polling stations. For example, users of airport bathrooms might tap a happy-face button at a kiosk if they thought the bathrooms were clean.
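
As a rough illustration of how such a meter might aggregate its inputs, here is a toy Python sketch; the touchpoint names and the simple happy/unhappy scoring scheme are assumptions, since Dubai has not published its implementation:

    # Toy aggregation of "happiness meter" inputs. Touchpoint names
    # and the voting scheme are invented for illustration.
    from collections import defaultdict

    # Each event: (touchpoint, vote) where vote is +1 (happy) or -1 (unhappy)
    events = [
        ("airport_bathroom_kiosk", +1),
        ("airport_bathroom_kiosk", +1),
        ("security_checkpoint", -1),
        ("street_cleanliness_app", +1),
    ]

    totals = defaultdict(lambda: [0, 0])  # touchpoint -> [happy, total]
    for touchpoint, vote in events:
        totals[touchpoint][1] += 1
        if vote > 0:
            totals[touchpoint][0] += 1

    for touchpoint, (happy, total) in totals.items():
        print(f"{touchpoint}: {happy / total:.0%} happy ({total} responses)")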

The theory behind happiness meters is that, if municipal officials can capture data from citizens about what it’s like to live in a city, “people will be more successful and take care of the community better,” Piva said. However, he acknowledged, “it’s a hard ROI to measure and takes lots of different touchpoints.”

A working definition of smart city

Ask just about any city official or technologist working for a city, and you are likely to get many different examples of a smart city. A strict definition is even harder to nail down.

Jack Gold, an analyst at J. Gold Associates, took a stab at a comprehensive definition but only after first jabbing at the broad ways the concept is used. “‘Smart city’ is one of those all-encompassing terms that everyone defines however they want,” he said.

But then, he added, “Really, a smart city is about having sensor data that then gets used to create actions. You can define a smart city as a city with better managed infrastructure that is variable, based on input of data and adjustments of the results to best utilize resources or improve safety.”

Piva and others might add that a city could use the data to improve the happiness of its visitors, residents and workers.

Gold added, “The ultimate goals of smart cities are power management, reducing pollution footprints, increasing public safety, or offering improved services to residents. The downside is that it takes investment in infrastructure, and most cities don’t have a lot of extra dollars to invest. But it’s coming in small steps in many places.”

Vendors are lining up

In addition to big tech companies like IBM, Cisco, GE and Intel, hundreds of smaller vendors of hardware, software and apps, such as SmartCitiesSolutions, want to cash in on the smart city phenomenon.

In Kansas City, Cisco partner Sensity Systems, a provider of high-tech outdoor lighting, is installing LED streetlights equipped with sensors so the lights can be dimmed automatically to match ambient light conditions. While city officials haven’t said what they expect to spend on the expensive new LED lighting, Sensity has stated the city stands to save $4 million a year with the new approach.
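
That dimming behavior is essentially a feedback rule mapping an ambient light reading to an output level. Here is a minimal sketch; the lux thresholds and the linear ramp are assumptions for illustration, not Sensity’s actual algorithm:

    # Map an ambient light reading to an LED output level (0.0 - 1.0).
    # Thresholds and the linear ramp are assumed values.
    def target_brightness(ambient_lux: float) -> float:
        DARK_LUX = 10.0     # below this, run at full output
        BRIGHT_LUX = 80.0   # above this (e.g., dawn), switch the lamp off
        if ambient_lux <= DARK_LUX:
            return 1.0
        if ambient_lux >= BRIGHT_LUX:
            return 0.0
        # Ramp down linearly between the two thresholds.
        return 1.0 - (ambient_lux - DARK_LUX) / (BRIGHT_LUX - DARK_LUX)

    for lux in (5, 45, 100):
        print(f"{lux} lux -> {target_brightness(lux):.0%} output")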

[Photo: KC station sign. Credit: Matt Hamblen]
Kansas City’s 2.2-mile streetcar line, coming next year, sits in the center of an innovation district that will include smart city elements like free Wi-Fi, interactive station kiosks and sensors to guide traffic and control streetlights.

Sensity has big ambitions for the world’s billions of streetlights and has created technology called Light Sensory Networks that turns an LED streetlight into a platform for data and video for blossoming Internet of Things networks. Each LED street lamp can become a sensor-equipped smart device with a unique IP address to serve as a node in a broadband network, often wirelessly. That smart device can power other smart devices, like video sensors or Wi-Fi access points, to support parking, surveillance or industrial applications, such as systems that tell city snowplows when and where to salt or plow snow.
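
A rough data model captures the idea of the lamp as an individually addressable node hosting other devices. The class and field names below are illustrative, not drawn from Sensity’s Light Sensory Networks API:

    # Illustrative model of a streetlight as an IP-addressable IoT node.
    from dataclasses import dataclass, field

    @dataclass
    class AttachedDevice:
        kind: str           # e.g., "video_sensor", "wifi_ap", "air_quality"
        power_draw_w: float

    @dataclass
    class StreetlightNode:
        node_id: str
        ip_address: str     # each lamp is individually addressable
        devices: list = field(default_factory=list)

        def attach(self, device: AttachedDevice) -> None:
            self.devices.append(device)

        def total_load_w(self) -> float:
            return sum(d.power_draw_w for d in self.devices)

    lamp = StreetlightNode("KC-0042", "10.20.30.42")
    lamp.attach(AttachedDevice("video_sensor", 12.5))
    lamp.attach(AttachedDevice("wifi_ap", 8.0))
    print(f"{lamp.node_id} hosts {len(lamp.devices)} devices,"
          f" drawing {lamp.total_load_w()} W")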

At CTIA Super Mobility Week 2015 in Las Vegas recently, Verizon showed a smart street lamp that was built by its partner Illuminating Concepts and is similar to those installed for a smart lighting project in Lansing, Mich. The streetlights are connected wirelessly to the cloud and can provide public announcements over audio speakers or via digital signs. They can also handle air pollution analysis and other functions. Each pole costs nearly $6,000, although pricing depends on the sensors installed and the functions the pole is used for.

In addition to Verizon, AT&T and other large U.S. wireless carriers have jumped on board the smart city movement. In Kansas City, Sprint recently invested $7 million for a free Wi-Fi zone around the coming 2.2-mile streetcar route.

Social scientists ponder the downside of the ‘smart city’

While the technology industry and city officials all over the world are promoting the various benefits that smart cities are expected to bring, at least two social scientists have recently raised concerns about the ways smart city technologies can be used to manipulate people with things like facial recognition systems and automated policing tools.

In a paper titled “The Spectrum of Control: A Social Theory of the Smart City,” Jathan Sadowski and Frank Pasquale called attention to some of the negative aspects of cities filled with networks of smart sensors.

“At present, smart city boosters are far too prone to assume that a benevolent intelligence animates the networks of sensors and control mechanisms they plan to install,” they wrote.

Both researchers are concerned that smart cities may feature networks that provide “little escape from a seamless web of surveillance.” That “web of surveillance” could clearly include facial recognition systems, but Sadowski and Pasquale argue that the potential to use technology to track people’s movements goes deeper — smartphones might be tracked via GPS or beacons, for example. Depending on the person using the technology, the collection of such information could be seen as beneficial or insidious.

“It is against [the] democratic egalitarian goal — of fair benefit- and burden-sharing — that alleged ‘smartenings’ of the city must be measured,” they conclude. Sadowski is a Ph.D. candidate at Arizona State University, and Pasquale is a law professor at the University of Maryland.

Other social scientists have raised similar red flags about smart city technologies, and officials in some cities have addressed citizens’ concerns that sensors and other smart systems could be used in a way that invades people’s privacy.

In Kansas City, the city council recently passed a resolution committing to follow data privacy best practices. The mayor also created a panel known as the Smart City Advisory Board to offer guidance on privacy concerns.

The nebulous smart city label

While Sadowski and Pasquale have joined a number of social commentators questioning where the smart city phenomenon is headed, they also condemned the broad way the term “smart city” has been defined.

“Major corporate players work hard to push smartness as an ideal and to pull city leaders and investors into the smartness orbit,” they state in their paper. “[They] have worked hard to create this market and to shape it in certain ways. Yet, with this massive growth and capital investment, the label ‘smart city’ is nebulous…. This ambiguity does a lot of work for smart city proponents and purveyors. The label…. [gives] them discursive cover in case they need to distance themselves if something goes wrong or doesn’t deliver on a promise.”

Smart city proponents, naturally, see things differently. They say it’s a little like the early days of the PC or the way that people first envisioned social networks like Facebook. A desktop computer was originally seen as a better tool for typing reports than an electric typewriter, but the machine later became the all-important, expansive portal to the Internet. And before Facebook exploded to global prominence, few could envision how important intimate mobile connections would one day be to millions of people.

“The exciting part is that we don’t know what we don’t know” about smart city technology, said Rick Usher, assistant city manager for Kansas City. Notice, he called it “exciting.”

 
