What is cloud computing? 

Cloud Computing Diagram

Cloud computing is a service-driven model for enabling ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources that can be rapidly provisioned and released with minimal management effort or service provider interaction.

A Road Map For Migrating To A Public Cloud

Cloud Migration

A Road Map for Migrating to A Public Cloud Environment

Today, most organizations are looking for ways to rein in sprawling IT budgets and define efficient paths for new development. Moving to the cloud is seen as a strategic and economically viable choice, primarily because it gives organizations quick access to new platforms, services, and toolsets. But migrating applications to a cloud environment requires a clear, well-thought-out cloud migration strategy.

We are past the years of confusion and fear about the cloud. In fact, almost everyone now agrees that the cloud is a key element of any company's IT investment. What is not yet clear is what to move, how to move it, and which industry best practices protect your investment in a public cloud environment. A solid migration plan is therefore an essential part of any cloud migration process.

Here are a few things you should pay close attention to when preparing a cloud migration planning template:

  • Data Protection

When planning to migrate to the cloud, remember that it is not a good idea to migrate every application. While you take your first steps, keep legacy apps and sensitive data, such as private banking information, off the cloud. That way, if your public cloud is breached, your sensitive data and legacy systems will not fall into the hands of unsavory individuals.

  • Security

Security of the data while it is being migrated should be just as important as security once it is in the cloud. Any temporary storage locations used during the migration process should be secured against unauthorized intrusion.

Although security can be hard to quantify, it is one of the key components and considerations of any cloud service. The most basic security responsibility is getting password security right. Remember that, while you can massively increase the security around your own applications, dealing with on-cloud threats and breaches is very different in practice, since you don't own any of the cloud software.
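Two-factor authentication is one practical upgrade over passwords alone, and many providers implement it as time-based one-time passwords. As a minimal sketch of what is happening under the hood, here is the standard TOTP derivation (RFC 6238); the secret below is the RFC's published test value, not a real credential:

```python
import hashlib, hmac, struct

def totp(secret: bytes, timestamp: int, digits: int = 6, step: int = 30) -> str:
    """Time-based one-time password (RFC 6238, HMAC-SHA1)."""
    counter = struct.pack(">Q", timestamp // step)        # 8-byte time-step counter
    digest = hmac.new(secret, counter, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                            # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 6238 test vector: 8-digit code at T=59 for the ASCII secret below.
code = totp(b"12345678901234567890", 59, digits=8)
```

Because the code depends only on a shared secret and the clock, a stolen password alone is not enough to log in.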

Some of the security concerns that you’ll need to look into include:

  • Is your data securely transferred and stored in the cloud?
  • Besides the passwords, does your cloud provider offer some type of 2-factor authentication?
  • How else are the users authenticated?
  • Does your provider meet the industry’s regulatory requirements?

  • Backup and Disaster Recovery Strategies

A backup and disaster recovery strategy ensures that your data will be protected in case of a disaster. These strategies are unique to each organization, depending on its application needs and how critical those applications are to the business.

To devise a foolproof DR strategy, it is important to identify and prioritize applications and determine the acceptable downtime for each application, service, and data set.
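Acceptable downtime is easier to reason about with a back-of-the-envelope estimate of how long a cloud restore actually takes. A small sketch; the 0.7 efficiency factor is an assumption standing in for protocol overhead and network contention:

```python
def restore_hours(data_gb: float, bandwidth_mbps: float,
                  efficiency: float = 0.7) -> float:
    """Rough time-to-restore a backup over the network.
    `efficiency` is an assumed discount for overhead and contention."""
    megabits = data_gb * 8 * 1000           # decimal GB -> megabits
    seconds = megabits / (bandwidth_mbps * efficiency)
    return seconds / 3600.0

# Restoring 500 GB over a 100 Mbps link takes on the order of 16 hours,
# which may already exceed the acceptable downtime for a critical app.
estimate = restore_hours(500, 100)
```

If the estimate is longer than an application's tolerable outage, that application needs a faster recovery path (warm standby, replication) rather than a plain backup.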

Some of the things to consider when engineering your backup and disaster recovery blueprint include:

  • Availability of sufficient bandwidth and network capacity to redirect all users in case of a disaster.
  • Amount of data that may require backup.
  • Type of data to be protected
  • How long can it take to restore your systems from the cloud?

  • Communications Capacity Enablement

Migrating to a cloud environment should make your business more agile and responsive to the market, so robust communications enablement should be part of the package. Ideally, your cloud provider should be able to provide you with a contact center, unified messaging, mobility, presence, and integration with other business applications.

While the level of sophistication and efficiency of on-premise communications platforms depends on the capabilities of the company IT’s staff, cloud environments should offer communication tools with higher customizations to increase productivity.

A highly customized remote communications enablement will allow your company to refocus its IT resources to new innovation, spur agility, cut down on hardware costs and allow for more engagements with partners and customers.

Simply put, cloud communications:

  • Increase efficiency and productivity
  • Enable reimagined experiences
  • Are designed for seamless interaction

  • Legal Liability and Protection

Other important considerations when developing your cloud migration planning template are compliance with regulatory requirements and software licensing. For many businesses, data protection and regulatory compliance with HIPAA and GDPR are constant concerns, especially when dealing with identifiable data. Getting this right the first time will let you move past compliance issues blissfully.

When migrating, look for a cloud provider with comprehensive security assurance programs and governance-focused features. This will help your business operate more securely and in line with industry standards.

Ready to migrate your processes to a public cloud environment? Follow these pointers to develop a comprehensive cloud migration planning template.

Public Cloud Versus Private Cloud

Cloud Computing

A public cloud strategy refers to a situation where you utilize cloud resources on a shared platform. Examples of shared, or public, cloud solutions include Microsoft Azure, Amazon Web Services, and Google Cloud. A private cloud strategy, on the other hand, refers to a situation where you operate infrastructure dedicated to serving your business. It is sometimes called homegrown, because you employ your own experts to run the services your business needs. There are several advantages of a public cloud over a private cloud which you should know before making an informed decision on the right platform to invest in. Some of the benefits of the public cloud strategy include the following:

Availability and scale of Expertise

If you compare public and private cloud services, the public cloud gives you access to more experts. Remember that the companies offering cloud services have enough employees ready to help many clients, and in most cases those clients will not all experience problems at the same time, so human resources can be directed toward solving your urgent issue. You can also scale up or down at any given time as the need arises, unlike a private cloud, where you must invest in infrastructure every time you want to upgrade.

Downgrading a private cloud system can expose you to losses, because it leaves some resources underutilized.

The Volume of Technical Resources to Apply

You can access more technical resources on a public cloud platform. The companies offering public cloud solutions are fully equipped with highly experienced experts, and they have the tools and resources to deliver sound technical solutions whenever you need them. In a private arrangement, by contrast, you will incur extra costs whenever a technical challenge calls for advanced tools and highly qualified experts.

Price point

The price of a private cloud is high compared to a public arrangement. If you are looking for ways to save money, the best approach is a public cloud solution. On a shared platform you pay only for what you need: if you do not need many resources at a given time, you can downgrade the services and enjoy fair prices. Services such as AWS offer good cost containment over time, which makes it easy to access services at fair prices. For any business to grow, it should invest in the package that brings a return on investment, and the services offered by public cloud systems allow businesses to save and grow. You should also take other factors, such as the ecosystem of cloud relationships, into consideration before making an informed decision: some business models prefer private cloud solutions, while others work well on public cloud-based solutions.
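The pay-for-what-you-use argument can be made concrete with a simple break-even calculation. A sketch with illustrative, made-up figures (real rates vary widely by provider and instance type):

```python
def breakeven_hours(capex_monthly: float, opex_monthly: float,
                    public_rate_per_hour: float) -> float:
    """Monthly instance-hours at which a private cloud's fixed cost equals
    public pay-per-use spend; below this point the public cloud is cheaper."""
    return (capex_monthly + opex_monthly) / public_rate_per_hour

# Hypothetical numbers: $4,000/month amortized hardware, $1,000/month staff
# and power, versus a public rate of $2.50 per instance-hour.
threshold = breakeven_hours(4000.0, 1000.0, 2.5)   # 2000 instance-hours
```

A business consuming well under the threshold each month saves money on the public cloud; one running far above it, around the clock, may justify private infrastructure.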

Major Cloud Computing Models

Cloud Computing

Cloud computing enables convenient, ubiquitous, measured, on-demand access to a shared pool of scalable and configurable resources, such as servers, applications, databases, networks, and other services. These resources can be provisioned and released rapidly with minimal interaction and management from the provider.

The rapidly expanding technology is rife with obscure acronyms, the major ones being SaaS, PaaS, and IaaS. These acronyms distinguish the three major cloud computing models discussed in this article. Notably, cloud computing meets virtually any imaginable IT need in diverse ways. In effect, the cloud computing models are necessary to show the role that a cloud service provides and how that function is accomplished. The three main cloud computing paradigms are illustrated in the diagram below.

The three major cloud computing models

Infrastructure as a Service (IaaS)

In the infrastructure as a service model, the cloud provider offers a service that allows users to process, store, share, and use other fundamental computing resources to run their software, which can include operating systems and applications. In this case, a consumer has minimal control over the underlying cloud infrastructure but significant control over operating systems, deployed applications, storage, and some networking components, such as host firewalls.

Based on this description, IaaS can be regarded as the lowest-level cloud service paradigm, and possibly the most crucial one. With this paradigm, a cloud vendor provides pre-configured computing resources to consumers via a virtual interface. By definition, IaaS pertains to the underlying cloud infrastructure and does not include applications or an operating system; implementation of the applications, the operating system, and some network components, such as host firewalls, is left to the end user. In other words, the role of the cloud provider is to enable access to the computing infrastructure necessary to drive and support the consumer's operating systems and application solutions.

In some cases, the IaaS model can provide extra storage for data backups, network bandwidth, or it can provide access to enhanced performance computing which was traditionally available using supercomputers. IaaS services are typically provided to users through an API or a dashboard.

Features of IaaS

  • Users transfer the cost of purchasing IT infrastructure to a cloud provider
  • Infrastructure offered to a consumer can be increased or reduced depending on business storage and processing needs
  • The consumer will be saved from challenges and costs of maintaining hardware
  • Data is highly available in the cloud
  • Administrative tasks are virtualized
  • IaaS is highly flexible compared to other models
  • Highly scalable and available
  • Permits consumers to focus on their core business and transfer critical IT roles to a cloud provider
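The elasticity described in the feature list, growing or shrinking infrastructure with demand, usually comes down to a scaling rule. A minimal sketch of a target-tracking rule; the 60% CPU target and fleet limits are illustrative assumptions, not any provider's defaults:

```python
import math

def desired_instances(current: int, cpu_utilization: float, target: float = 0.6,
                      min_n: int = 1, max_n: int = 20) -> int:
    """Target-tracking scaling rule: size the fleet so that average CPU
    utilization moves toward `target`. Rounding before ceil() guards
    against float noise (e.g. 3.6 / 0.6 -> 6.000000000000001)."""
    if cpu_utilization <= 0:
        return min_n
    wanted = math.ceil(round(current * cpu_utilization / target, 6))
    return max(min_n, min(max_n, wanted))

# 4 instances running at 90% CPU should grow to 6 to approach the 60% target.
grown = desired_instances(4, 0.9)
```

The consumer only declares the target and the bounds; the IaaS platform applies a rule like this continuously, which is what makes the capacity appear elastic.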

Infrastructure as a Service (IaaS)

IaaS Use Cases

A series of use cases illustrates the benefits and features afforded by IaaS. For instance, an organization that lacks the capital to own and manage its own data centers can purchase an IaaS offering to achieve fast and affordable IT infrastructure, and the service can be expanded or terminated as the consumer's needs change. Traditional organizations seeking large computing power with low expenditure can also deploy IaaS to run their workloads. The model is likewise a good option for rapidly growing enterprises that avoid committing to specific hardware or software because their business needs are likely to evolve.

Popular IaaS Services

Major IT companies are offering popular IaaS services that are powering a significant portion of the Internet even without users realizing it.

Amazon EC2: Offers scalable and highly available computing capacity in the cloud. Allows users to develop and deploy applications rapidly without upfront investment in hardware

IBM’s SoftLayer: Cloud computing services offering a series of capabilities, such as computing, networking, security, storage, and so on, to enable faster and reliable application development. The solution features bare-metal, hypervisors, operating systems, database systems, and virtual servers for software developers.

NaviSite: offers application services, hosting, and managed cloud services for IT infrastructure

ComputeNext: the solution empowers internal business groups and development teams with DevOps productivity from a single API.

Platform as a Service (PaaS)

Platform as a service model involves the provision of capabilities that allow users to create their applications using programming languages, tools, services, and libraries owned and distributed by a cloud provider. In this case, the consumer has minimum control over the underlying cloud computing resources such as servers, storage, and operating system. However, the user has significant control over the applications developed and deployed on the PaaS service.

In PaaS, cloud computing is used to provide a platform on which consumers can develop, initialize, implement, and manage their applications. The offering includes a base operating system and a suite of development tools and solutions. PaaS effectively eliminates the need for consumers to purchase, implement, and maintain the computing resources traditionally required to build useful applications. Some people use the term 'middleware' for the PaaS model, since the offering sits comfortably between SaaS and IaaS.
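On a PaaS, the unit the consumer ships is just the application; the platform supplies the operating system, runtime, and routing. A minimal sketch of such a deployable unit as a WSGI app; the convention that the platform injects the listening port through a `PORT` environment variable is an assumption modeled on Heroku-style services:

```python
import os

# Assumption: the platform injects the listening port via the environment,
# as Heroku-style PaaS offerings do with $PORT.
PORT = int(os.environ.get("PORT", "8000"))

def app(environ, start_response):
    """Minimal WSGI application: the deployable unit a PaaS hosts and scales."""
    body = b"Hello from the platform\n"
    start_response("200 OK", [("Content-Type", "text/plain"),
                              ("Content-Length", str(len(body)))])
    return [body]

# To run locally: wsgiref.simple_server.make_server("", PORT, app).serve_forever()
```

Everything below this file, the server software, patching, and scaling, is the provider's problem, which is exactly the division of control described above.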

Features of PaaS

  • PaaS offers a platform with development, testing, and hosting tools for consumer applications
  • PaaS is highly scalable and available
  • Offers a cost-effective and simple way to develop and deploy applications
  • Users can focus on developing quality applications without worrying about the underlying IT infrastructure
  • Business policy automation
  • Many users can access a single development service or tool
  • Offers database and web services integration
  • Consumers have access to powerful and reliable server software, storage capabilities, operating systems, and information and application backup
  • Allows remote teams to collaborate, which improves employee productivity

Platform as a Service (PaaS)

PaaS Use Cases

Software development companies and other enterprises that want to implement agile development methods can explore PaaS capabilities in their business models. Many PaaS services can be used in application development. PaaS development tools and services are always updated and made available via the Internet to offer a simple way for businesses to develop, test, and prototype their software solutions. Since developers’ productivity is enhanced by allowing remote workers to collaborate, PaaS consumers can rapidly release applications and get feedback for improvement. PaaS has led to the emergence of the API economy in application development.

Popular PaaS Offerings

There are major PaaS services helping organizations streamline application development. A PaaS offering is delivered over the Internet and allows developers to focus on creating quality, highly functional applications without worrying about the operating system, storage, and other infrastructure.

Google’s App Engine: the solution allows developers to build scalable mobile and web backends in any language in the cloud. Users can bring their own language runtimes, third-party libraries, and frameworks

IBM BlueMix: this PaaS solution from IBM allows developers to avoid vendor lock-in and leverage the flexible and open cloud environment using diverse IBM tools, open technologies, and third-party libraries and frameworks.

Heroku: the solution provides companies with a platform where they can build, deliver, manage, and scale their applications while abstracting and bypassing computing infrastructure hassles

Apache Stratos: this PaaS offering offers enterprise-ready quality service, security, governance, and performance that allows development, modification, deployment, and distribution of applications.

Red Hat’s OpenShift: a container application platform that offers operations and development-centric tools for rapid application development, easy deployment, scalability, and long-term maintenance of applications

Software as a Service (SaaS)

The software as a service model involves capabilities provided to users through a cloud vendor's application hosted and running on a cloud infrastructure. Such applications are conveniently accessible from different platforms and devices through a web browser, a thin-client interface, or a program interface. In this model, the end user has minimal control over the underlying cloud-based computing resources, such as servers, the operating system, or the application's capabilities.

SaaS can be described as a software licensing and delivery paradigm featuring complete, functional software solutions provided to users on a metered or subscription basis. Since users access the application via browsers or thin-client and program interfaces, SaaS makes the host operating system insignificant to the operation of the product. As mentioned, the service can be metered: some SaaS customers are billed based on their consumption, while others pay a flat monthly fee.
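The metered-versus-subscription distinction above can be sketched in a few lines; the unit prices here are made-up illustrations, not any vendor's rates:

```python
from typing import Optional

def monthly_bill(units_used: float, rate_per_unit: float,
                 flat_fee: Optional[float] = None) -> float:
    """SaaS billing: pay per unit consumed, unless the plan is a
    flat monthly subscription."""
    if flat_fee is not None:
        return flat_fee
    return units_used * rate_per_unit

# Metered plan: 1,200 API calls at $0.25 each.
metered = monthly_bill(1200, 0.25)
# Subscription plan: same usage, flat $199/month.
flat = monthly_bill(1200, 0.25, flat_fee=199.0)
```

Heavy users come out ahead on a flat subscription, light users on metering, which is why most SaaS vendors offer both.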

Features of SaaS

  • SaaS providers offer applications via a subscription structure
  • Users transfer the need to develop, install, manage, or upgrade applications to SaaS vendors
  • Applications and data are securely stored in the cloud
  • SaaS is easily managed from a central location
  • Remote servers are deployed to host the application
  • Users can access the SaaS offering from any location with Internet access
  • On-premise hardware failure does not interfere with the application or cause data loss
  • Users can reduce or increase use of cloud-based resources depending on their processing and storage needs
  • Applications offered via the SaaS model are accessible from almost all Internet-enabled devices

Software as a Service (SaaS)

SaaS Use Cases

SaaS is a typical choice for companies seeking the benefits of quality applications without the need to develop, maintain, and upgrade the required components. Companies can acquire SaaS solutions for ERP, mail, office applications, collaboration tools, and more. SaaS is also crucial for small companies and startups that wish to launch an e-commerce service rapidly but lack the time and resources to develop and maintain the software or buy servers to host the platform. SaaS also suits companies with short-term projects that require collaboration among members located remotely.

Popular SaaS Services

SaaS offerings are more widespread as compared to IaaS and PaaS. In fact, a majority of consumers use SaaS services without realizing it.

Office 365: this cloud-based solution provides productivity software for subscribed consumers, allowing users to access Microsoft Office tools on platforms such as Android, macOS, and Windows.

Box: the SaaS offers secure file storage, sharing, and collaboration from any location and platform

Dropbox: modern application designed for collaboration and for creating, storing, and accessing files, docs, and folders.

Salesforce: among the leading customer relationship management platforms, offering a series of capabilities for sales, marketing, service, and more.

Today, cloud computing models have revolutionized the way businesses deploy and manage computing resources and infrastructure. With the advent and evolution of the three major cloud computing models, that is, IaaS, PaaS, and SaaS, consumers can find a suitable cloud offering that satisfies virtually all IT needs. These models' capabilities, coupled with competition among popular cloud computing service providers, will continue delivering IT solutions for consumers demanding availability, enhanced performance, quality services, better coverage, and secure applications.

Consumers should review their business needs and do a cost-benefit analysis to approve the best model for their business. Also, consumers should conduct thorough workload assessment while migrating to a cloud service.

Common Information Technology Architectures

Overview Of Common Information Technology Architectures

The world is currently in the information technology era, and many experts are of the opinion that the Silicon Valley days are beginning to come to an end. Information technology is what much of the world revolves around today, which makes it worthwhile to consider a technical overview of how IT architecture is used. The term information technology is often used as a synonym for computer networks, but it also encompasses other information-related technologies such as television and cell phones, showing the connection between IT and ICT (though IT and ICT are often used interchangeably, they are technically different).

An IT architecture is the framework that supports an organization or system. In computing, information technology architecture involves the virtual and physical resources supporting the collection, processing, analysis, and storage of data. The architecture can be integrated into a single data center or decentralized across multiple data centers, managed either by the IT department or by a third-party IT firm, such as a cloud provider or a colocation facility. IT architectures come into play when we consider computer hardware (Big Iron: mainframes and supercomputers), software, networks (LAN/WAN server-based), e-commerce, telecom equipment, storage (cloud), and so on.

Information Technology Industry Overview

Human beings have been able to manipulate, store, and retrieve data since 3000 BC, but the modern sense of information technology first appeared in a 1958 article in the Harvard Business Review. The authors, Harold J. Leavitt and Thomas L. Whisler, commented that the new technology lacked an established name; it would be called information technology (IT). Information technology is used in virtually all sectors and industries: education, agriculture, marketing, health, governance, finance, and so on. Whatever you do, it is useful to have a basic overview of the architectural uses of information technology. Below we look at some standard information technology architectures with regard to technology environment patterns such as Big Iron (mainframes and supercomputers), LAN/WAN server-based networks, and cloud storage.

Big Iron (Mainframe & Supercomputers)

Big iron is a term used by hackers; the Jargon File, the hacker's dictionary, defines it as "large, expensive, ultra-fast computers. It is used for number crunching supercomputers such as Crays, but can include more conventional big commercial mainframes". The term is often applied to IBM mainframes when discussing their survival after the invention of lower-cost Unix computing systems. More recently, it has also been applied to highly efficient computer servers and server ranches, whose steel racks work in much the same manner.

Supercomputers are known to be the world's fastest and largest computers, and they are primarily used for complex scientific calculations. A supercomputer and a desktop computer share similar components: both have memory, processors, and hard drives. Despite the similarities, their speeds are significantly different: supercomputers are far faster and far larger. A supercomputer's large disk storage, high memory, and many processors increase the speed and power of the machine. While desktop computers can perform thousands or millions of floating-point operations per second (megaflops), supercomputers perform at billions of operations per second (gigaflops) and even trillions of operations per second (teraflops).

Mainframe Computers

Evolution Of Mainframe and Supercomputers

Many of today's computers are indeed faster than the very first supercomputer, the Cray-1, which was designed and developed by the Cray Research team in the mid-70s. The Cray-1 could compute at a rate of 167 megaflops using a rapid technique called vector processing, which executes instructions in a pipelined fashion. In the mid-80s a faster method of supercomputing originated, called parallel processing. Applications that use parallel processing solve computational problems by using multiple processors. For example, suppose you were going to prepare ice cream sundaes for nine of your friends. You would need ten scoops of ice cream, ten bowls, ten drizzles of chocolate syrup, and ten cherries. Working alone, you would put one scoop of ice cream in each bowl and then drizzle the syrup on each one; this method of preparing sundaes is analogous to vector processing. To get the job done more quickly, you could get some friends to help in a parallel-processing fashion: if five people prepare the sundaes, the job is done roughly five times as fast.
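The sundae analogy maps directly onto code. A small sketch using worker threads to stand in for the five helping friends (on real supercomputers the workers would be separate processors or nodes):

```python
from concurrent.futures import ThreadPoolExecutor

def prepare_sundae(n: int) -> str:
    """One unit of work: assemble sundae number n."""
    return f"sundae #{n}: scoop, syrup, cherry"

# Serial, vector-style: a single worker handles the sundaes one after another.
serial = [prepare_sundae(i) for i in range(10)]

# Parallel-style: five "friends" (worker threads) split the same job.
with ThreadPoolExecutor(max_workers=5) as pool:
    parallel = list(pool.map(prepare_sundae, range(10)))

assert serial == parallel  # same result, but the work was divided up
```

The key property, just as in the analogy, is that the result is identical; only the elapsed time changes when the work is divided among workers.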

Parallel Processing

Application Of Mainframe and Supercomputers

Supercomputers are so powerful that they can give researchers insight into phenomena that are too small, too fast, too big, or too slow to observe in laboratories. Astrophysicists use supercomputers as time machines to explore the past and the future of the universe. A fascinating supercomputer simulation created in the year 2000 depicted the collision of two galaxies, Andromeda and our own Milky Way, even though this collision will not happen for another 3 billion years.

This simulation allowed scientists to experiment and view the result now. It was conducted on Blue Horizon, a parallel supercomputer at the San Diego Supercomputer Center. Using 256 of Blue Horizon's 1,152 processors, the simulation showed what would happen to millions of stars if the galaxies collided. Another example is molecular dynamics, the study of how molecules interact with each other. Simulations run on supercomputers allow scientists to study the interactions when two molecules are docked. Researchers can generate an atom-by-atom picture of the molecular geometry by determining the shape of a molecule's surface. Experimentation at the atomic level is extremely difficult or impossible to perform in a laboratory, but supercomputers have paved the way for scientists to simulate such behavior with ease.

Supercomputers Of The Future

Research centers are always diving into new applications, such as data mining, to explore additional uses of supercomputing. Data mining allows scientists to find previously unknown relationships among data; for example, the Protein Data Bank at the San Diego Supercomputer Center collects scientific data that gives scientists around the world better ways of understanding biological systems, providing researchers with new insights into the causes, effects, and treatments of many diseases. The capabilities and applications of supercomputers will continue to grow as institutions all over the world share their discoveries, making researchers more proficient at parallel processing.

Information Technology Data Storage

Electronic data storage, which is widely used in modern computers, dates from World War II, when delay-line memory was developed to remove clutter from radar signals. Another early device was the Williams tube, the first random-access digital storage, based on the cathode ray tube. The information stored in delay-line and Williams-tube memory was volatile: it had to be continuously refreshed, and it was lost whenever power was removed. The first form of non-volatile computer storage was the magnetic drum, invented in 1932 and used in the Ferranti Mark 1, the first commercially available general-purpose electronic computer.

IBM introduced the first hard disk drive in 1956, as a component of its 305 RAMAC computer system. Most digital data today is stored magnetically on hard disks or optically on media such as CD-ROMs. In 2002, digital storage capacity exceeded analog capacity for the first time. By 2007, almost 94% of the data stored in the world was held digitally: 52% on hard disks, 28% on optical devices, and 11% on digital magnetic tape. The worldwide capacity for storing information on electronic devices grew from 3 exabytes in 1986 to 295 exabytes in 2007, doubling roughly every three years.
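The "doubling every three years" claim follows from the two endpoints. A quick worked check, assuming steady exponential growth over the 21-year span:

```python
import math

def doubling_time(start_amount: float, end_amount: float, years: float) -> float:
    """Doubling time implied by steady exponential growth between two points."""
    rate = math.log(end_amount / start_amount) / years   # continuous growth rate
    return math.log(2) / rate

# 3 exabytes in 1986 -> 295 exabytes in 2007 is a 21-year span.
t = doubling_time(3, 295, 21)   # roughly 3.2 years
```

So the figures in the paragraph are internally consistent: capacity doubled about every 3.2 years over that period.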

Cloud Computing

Cloud Storage

Cloud storage is a modern data storage model in which digital data is stored in logical pools, while the physical storage spans multiple servers, and often multiple locations, owned and managed by a hosting company. Cloud storage providers are responsible for keeping the data available and accessible; individuals and organizations lease or buy storage capacity from the providers to store user, organization, or application data. Cloud storage originally referred to a hosted object-storage service, but over time the term has broadened to include other types of data storage available as a service, such as block storage. Examples of object storage services are Amazon S3 and Microsoft Azure Storage, while OceanStore and VISION Cloud are storage systems that can be hosted and deployed with cloud characteristics.

Cloud computing is changing the implementation and design of IT infrastructures. Traditional business-owned data centers are mostly private, capital-intensive resources (Big Iron: mainframes and supercomputers); cloud-based computing, on the other hand, gives organizations access to cloud service providers with credible data center infrastructure for a fee, while avoiding most capital expenditure. In the infrastructure-as-a-service model, cloud computing allows flexible data storage on demand. Consumers can ask a cloud service provider to store data, provide compute, and offer other IT-related services without installing hardware and other resources locally, saving space and money, and users can quickly adjust cloud usage depending on the required workload.

Servers

On a typical day, people tend to use different IT-based servers or networks. Firstly, the process of checking your email, over a Wi-Fi connection on your PC, in your house, is a typical server.

Logging on to your computer at your place of work to access files from the company's database uses another. When you are out for coffee, the Wi-Fi hotspot at the coffee shop is yet another type of network-based communication.

All of these networks are set up differently. Networks are mainly categorized by the geographic area they cover and the requirements within that area. A network can serve anyone from a single person with one device to millions of people and devices anywhere on the planet.

Some common network types we will consider include:

  • WAN (Wide Area Network)
  • LAN (Local Area Network)
  • PAN (Personal Area Network)
  • MAN (Metropolitan Area Network)

Let’s go into some detail on these networks.

Area Net Relative Size Relationship

PAN (Personal Area Network)

A PAN (personal area network) is a network serving a single person within a building or nearby, such as a small office or a home. A PAN can include one or more PCs, phones, small gadgets, video game consoles, and other devices. When several people use the same network within a home, it is sometimes known as a HAN (home area network).

In a very common setup, a home has a single wired Internet connection attached to a modem, which then provides both wired and wireless service to multiple devices. The network is typically managed from a PC but can be accessed from other electronic devices as well.

This kind of network gives incredible flexibility. For instance, it enables you to:

  • Send a document to the printer in the office upstairs while you sit in another room with your laptop
  • Upload pictures from your mobile phone to cloud storage linked to your desktop PC
  • Watch movies from an Internet streaming platform on your TV

If this sounds familiar, you likely have a PAN in your home without knowing what it's called.

LAN (Local Area Network)

A LAN (local area network) consists of a computer network at a single site, typically an individual office building. A LAN is useful for sharing resources such as data storage and printers. LANs can be built with relatively inexpensive hardware such as network adapters, hubs, and Ethernet cables.

A small LAN may connect just two PCs, while larger LANs can accommodate many more. A LAN usually relies on wired networking for speed and security, though wireless devices can also be connected to it. High speed and relatively low cost are the defining attributes of LANs.

LANs are typically used where people need to share resources and information among themselves but not with the outside world. Think of an office building where everyone should be able to access files on a server or print a document to one or more printers. Those tasks should be easy for everyone working in the same office, but you would not want a stranger to walk in and have the same access.
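The office file-sharing scenario can be sketched with Python's built-in http.server module. On a real LAN you would bind to the machine's LAN address so colleagues could reach it; the sketch below binds to localhost purely for illustration.

```python
# Minimal file sharing over a network using only the standard library.
# SimpleHTTPRequestHandler serves the current working directory.
# Binding to 127.0.0.1 keeps this demo local; on a LAN you would bind
# to the machine's LAN IP instead so other hosts can connect.
import http.server
import socketserver
import threading
import urllib.request

handler = http.server.SimpleHTTPRequestHandler

# Port 0 asks the OS for any free port.
with socketserver.TCPServer(("127.0.0.1", 0), handler) as httpd:
    port = httpd.server_address[1]
    threading.Thread(target=httpd.serve_forever, daemon=True).start()

    # Any machine that can reach this host and port could now browse files.
    with urllib.request.urlopen(f"http://127.0.0.1:{port}/") as resp:
        print(resp.status)  # 200: the directory listing was served

    httpd.shutdown()
```

This is exactly the kind of service that should be easy inside the office but invisible outside it, which is why LANs are commonly isolated from the public Internet by a firewall.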

 

MAN (Metropolitan Area Network)

A MAN (metropolitan area network) consists of a computer network spanning an entire city, a campus, or a small region. Depending on the configuration, this type of network can cover an area from about 5 to 50 kilometers across. A MAN is often used to connect a group of LANs together into a broader network. When a network of this kind is built mainly for a campus, it can be called a CAN (campus area network).

WAN (Wide Area Network)

A WAN (wide area network) covers a vast region, such as an entire country or the whole world. A WAN can contain various smaller networks, such as LANs or MANs. The Internet is the best-known example of a public WAN.
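The four network types above are distinguished chiefly by geographic reach, which can be summarized in a small classifier. The boundary values are rough approximations drawn from the ranges discussed in this article, not formal definitions.

```python
# Rough classification of network types by geographic span.
# Thresholds are approximate, based on the ranges in this article
# (e.g. 5-50 km for a MAN); real deployments blur these lines.

def classify_network(span_km: float) -> str:
    """Return the network class that typically covers the given span."""
    if span_km < 0.05:      # a person's immediate space: a room or home
        return "PAN"
    elif span_km < 5:       # a building or single site
        return "LAN"
    elif span_km < 50:      # a city or campus
        return "MAN"
    else:                   # a country or the whole world
        return "WAN"

print(classify_network(0.01))   # PAN
print(classify_network(1))      # LAN
print(classify_network(20))     # MAN
print(classify_network(5000))   # WAN
```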

Conclusion

The world is changing rapidly, and with so much change happening it is good that education can reach students in new ways. Today's students are the leaders, teachers, inventors, and businesspeople of tomorrow. Information technology plays a crucial role in enabling students to keep their jobs while going to school, especially now that most schools offer online courses and classes that can be accessed on tablets, laptops, and mobile phones.

Information technology is reshaping many aspects of the world's economies, governments, and societies. IT provides more efficient services, catalyzes economic growth, and strengthens social networks, with about 95% of the world's population now living in an area where IT is in active use. IT is diverse; whatever you are using to read this article is itself built on IT. Technological advancement is a positive force behind economic growth, citizen engagement, and job creation.

IBM Db2 on Cloud, IBM Db2 Warehouse, IBM Db2 Warehouse on Cloud (Previously IBM dashDB), and IBM Integrated Analytics System – Useful links

Documentation

Here are a few useful references for IBM Db2 on Cloud, IBM Db2 Warehouse, IBM Db2 Warehouse on Cloud (previously IBM dashDB), and IBM Integrated Analytics System, which will hopefully be helpful.

Table of useful IBM Db2 on Cloud, IBM Db2 Warehouse, IBM Db2 Warehouse on Cloud (previously IBM dashDB), and IBM Integrated Analytics System links:

  • SQL Reference > Statements
    https://www.ibm.com/support/knowledgecenter/SS6NHC/com.ibm.swg.im.dashdb.sql.ref.doc/doc/r0011049.html

  • Installing the Db2 driver package
    https://www.ibm.com/support/knowledgecenter/SS6NHC/com.ibm.swg.im.dashdb.doc/connecting/connect_driver_package_install.html

Related References