Modified by pentago
It's no secret that online shopping has driven up the competition and is making it possible for people to find cheap deals on everything, and it seems this effect has trickled down into the car insurance industry as well. Prior to the advent of the many insurance comparison sites that you can now find online, a consumer would typically meet with a local agent or call a hotline to discuss their options. Nowadays, that process is streamlined and there's much more transparency, making it easier than ever for people to find the absolute lowest price at any time.
On Demand 24 Hour Comparisons Worldwide
In the dark ages before the dawn of the great internet, there were hours when it was impossible to compare all available car insurance plans. Now you can visit any number of car insurance comparison sites at any hour of the day or night, regardless of where in the world you are or which language you speak. For example, Comparaencasa is a popular insurance comparison site for Spanish speakers. This makes the comparison process less urgent and takes away the exclusivity factor that used to pressure consumers into buying overpriced plans.
Easier Access to Information
In addition to providing a number of comparison platforms, the internet is also an invaluable research tool that helps build savvier consumers in every industry. People no longer get trapped in horrible contracts en masse, because most of us take the time to look up our options online before making a significant commitment, like which car insurance provider we're about to use for the next 6 months to a year or longer. Furthermore, regulatory authorities have taken advantage of this transparency to place limits on the amounts that insurance companies can charge policyholders, so the likelihood of being a victim of price gouging in this industry has gone down significantly in today's internet-centric era.
Mobile Apps and Text Notifications
Mobile apps and online communication are two of the best things that the internet has ever spawned, and together they make managing and comparing insurance plans much easier. Instead of having to call a hotline and speak to an agent about a status or to answer a specific question, consumers can now depend on simple text notifications and mobile app interfaces that provide access to critical account information and FAQ sections. It's even possible to open support tickets and handle inquiries from the comfort of your smartphone thanks to the gigantic network of computers we know as the world wide web.
Using the Available Knowledge and Tools to Your Advantage
What good are all of the advancements mentioned above if you don't put them to good use? Unfortunately, even though it's now possible to compare a multitude of options, there are still many people who don't want to spend the time or effort and would rather just opt for the most popular brand they can think of offhand. While that will still get you car insurance, and usually at a decent price, if you're the type who likes to shop around you should be using the single greatest shopping resource on the planet – the internet.
For every successful app on the market, there are thousands of services and products that simply could not make it. So what’s the secret of a popular app? What makes some app developers struggle to increase their income while others make millions seemingly overnight? Of course, it’s well worth outsourcing to an app developer with experience and know-how. It’s just as important to understand the monetization strategy behind your app as early as you can. In tandem, the best app developer and a great monetization strategy will significantly increase your chances of a successful app.
The vast majority of apps that you can find on the App Store or Google Play are free to use. That means that although your initial goal of making your app popular may be accomplished, you are still not making money on it, or at least not as much as you wanted. Pokémon Go, a viral mobile game, is a great example of how a hip, free-of-charge app can make a lot of money for its owners as well as for many other businesses. The game is based on augmented reality (AR) technology that combines your real-world location with your phone’s camera to show virtual Pokémon on the display.
The economics behind Pokémon Go start with its free-to-download model. By offering a free app you can easily raise the number of people who use it and create a buzz. Freemium products tend to generate far more attention than premium-only ones: according to Forbes, a search for mentions of HootSuite (a freemium product) returns about 10M results, while Sendible (a premium-only competitor) returns about 160K. Similarly, a search for mentions of MailChimp (a freemium product) returns about 10M results, while Aweber (a premium-only competitor) returns about 718K. That said, not every aspect of the game is accessible without paying. Players can make purchases during the game, like the in-game currency PokéCoins, which they use to buy helpful items such as Poké Balls in order to “catch” Pokémon, and for inventory upgrades. The key here is to set up proper, well-balanced incentives for these in-app purchases, as they can become a viable strategy to drive revenue. There are several ways to implement in-app purchases in a free-to-download app: users can buy additional time or items in games, extra functionality, or extended usage. Worldwide revenue from in-app purchases is expected to hit about $58 billion by the end of 2016 and to skyrocket to $76.5 billion in 2017, according to the Statista portal.
Companies widely employ geolocation as an advertising technology: when a user walks by a partner business or is located in a particular part of the city, your app shows them a unique promo message. This is applicable to any kind of app, whether a game or a business tool. Wouldn’t it be great if your clients received a pop-up saying that your partner is nearby and can offer their services, or that a shop nearby offers a 10% discount to those using your app? You can follow suit and use geolocation in your app to provide bonuses or coupons when people enter your partner’s business.
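The geofencing idea described above can be sketched in a few lines: compute the distance between the user and each partner location and trigger a promo when it falls inside a radius. This is only a hypothetical illustration – the partner data, coordinates, and 200-meter radius are made up, and a real ad platform would handle this for you.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in kilometres."""
    r = 6371.0  # mean Earth radius in km
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def promo_for_location(user_lat, user_lon, partners, radius_km=0.2):
    """Return the promo messages of partners within radius_km of the user."""
    return [p["promo"] for p in partners
            if haversine_km(user_lat, user_lon, p["lat"], p["lon"]) <= radius_km]
```

In a real app, the location check would run on a location-change callback rather than on demand, and the partner list would come from a server.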
According to The App Solutions, a mobile app development company, location-based advertising is one of the ‘most exciting’ mobile opportunities right now. Because of games like Pokémon Go, location-based ads should get a boost from augmented-reality gaming in 2016 and beyond. The App Solutions’ data shows that 8 out of 10 consumers prefer using their smartphones when making a final decision before purchasing, so it seems worth considering mobile geolocation technology to drive revenue.
Another way Niantic Inc, the developer of the app, can make money is by creating a companion device that improves the gamers’ experience. If you choose to follow this path to promote your app, then this device, like the Pokémon Go Plus, should be customizable and branded with your app’s logo so that it is easily recognized. Such a feature lets your active users improve their overall experience and keep using the app even more regularly. Many other companies, like Fitbit and Nike, have done this before, but in reverse: they created a device first, followed by an app to improve the user experience. Whichever way you choose, there is money to be made.
You can also partner up with existing businesses to use their technology or hardware to accommodate your app. Mobile entertainment has always been tied to mobile devices. But if you look at recent developments like the Samsung Gear or the Apple Watch, their wearable technology dramatically magnifies the value of the apps on the companion phones. So feel free to find a hardware provider you can partner with and use their wearable tech to improve the user experience and make extra money.
Paid app & real life PokeSpots
One more option to monetize your app is to create an app for the app, however crazy that may sound. With Pokémon Go, this task is not as easy as with your average app because it uses the new AR tech. In order to succeed in this game, you have to move around a lot to find your next Pokémon or PokéSpot. The app on its own does not have a map detailing these locations, but apps have sprung up that offer a detailed map for a small fee. Creating a paid app that improves the experience of a free app is nothing new, and it is easy to implement.
While using Pokémon Go’s AR you have to actually move around a lot and visit physical spaces, whether a café, a McDonald’s, a restaurant, or a mall. Given the viral nature of this app, many businesses can order a training ground from Niantic Inc. located on their premises. This draws in people who are playing the game, and they can make purchases on the go or stay and battle it out with others on their mobile phones. Sales of snacks, coffee, branded cups, and other merchandise can be greatly boosted with such an option.
What Niantic Inc and their partners do is not game changing. They use up-to-date viral technology to rake in profits, both passively (through increases in stock price) and actively (by engaging various revenue-harvesting techniques). Most importantly, they created a franchise that is a win-win for the app owner and their partners – the small and large businesses that increase their social presence in their neighborhood or sell hardware to their clients.
This game is, in its essence, pretty repetitive, so the company will be introducing new features and levels along the way. The same can be true for your app: you don’t need to launch all the monetization methods instantly – choose the simplest tools and grow along the way.
Utilizing the latest technology like AR, combined with different ways of monetizing your app, seems to be one of the paths that lead to a powerful, accelerating business. Alas, there is no ABC way to do it. Hopefully, the advice above will give you something to think about.
The development of technology is changing the face of today’s recruiting. Gone are the days of long, drawn-out recruiting processes done over the phone or even solely in person. Today’s recruiting is becoming even faster due in large part to newly developed recruiting apps. So, how do they work, and are they actually effective? Today’s companies are turning more and more to recruiting apps to find candidates who are solid matches, and here is why.
Recruitment apps function in two ways. First, the apps present information about different available candidates to the recruiters. Second, recruitment apps provide information about open positions to prospects. So, in essence, recruitment apps help employers and candidates find one another and make connections. With more than 65 percent of today’s applicants searching for jobs on mobile devices in particular, the support for the further development of these apps is overwhelming.
Why Use an App
To sum it up, these apps simplify the process for recruiters, which enables them to explore a broader, more diverse pool of talent. They also streamline the process by enabling companies to skip the middlemen or headhunters and go to candidates directly. Moreover, apps don’t function based on appointments, so both applicants and recruiters have access to crucial information at any given time, in any place. Additionally, apps are more effective at identifying quality candidates than traditional hiring methods.
How to Effectively Use the Apps
As companies constantly fight to control the current market, many are taking varied approaches to the use and development of their apps. To most effectively incorporate these apps into any hiring system, it’s important to have a little savvy with this technology. The following are a few simple tips to leverage the current market.
1. Understand the process
Knowing the ins and outs of both sides of the process helps recruiters leverage apps more effectively. It is not enough to only use the interface from the recruiting side. Recruiters also need to know how these apps function from the job seeker’s side to be able to provide the most beneficial information and experience for users. When used to its full extent, this technology can enable recruiters to locate and reach out to candidates in the area, providing an authentic connection, such as meeting for lunch, coffee, or drinks.
2. Create a memorable interface and “storefront”
What the user sees is what will either attract potential candidates to your business or turn them away. Making sure the interface you use is memorable, easy-to-use, and attractive makes all the difference for attracting new potentials. The interface you utilize as a recruiter must be intuitive for your users and should present a clean, clear picture of what the opportunity looks like for them. Picture the app and the information presented in it as your “storefront,” and make sure it is attractive and clean for users.
3. Make it as direct and relevant as possible
If you’re working in a niche market, make sure the information you are providing and the app you are using reflects that. Consider enabling video posts from your candidates to test their knowledge in the specialized area you are looking for. You may even want to share pseudo-interview questions to get a better feel for your candidates from their responses. Doing so will help you get the best possible connections with your potential candidates.
4. Embrace the social media trend
Today’s market is one where social media is king. Make your company and its opportunities stand out by embracing the trends. Post customized recruitment advertisements through photographs on social media, or share short videos to intrigue the younger demographic. The more creative you are with your recruiting techniques, the more creative and talented prospects you’ll attract.
The search for talent is never-ending, but that doesn’t mean the process isn’t changing. Today’s trend is toward recruiting apps, and using recruitment software effectively is the best way to attract high-quality candidates.
With PayPal being so ubiquitous nowadays it can be easy to think that you don't need to accept any other forms of payment online. After all, doesn't everyone have a PayPal account? Not quite. The last time we checked, while PayPal is certainly the most popular online payment processor, they haven't exactly taken over the internet yet. As we explain below, there are a variety of reasons why every online store should accept more than just PayPal:
Every Legit Business Should Be Able to Accept Credit and Debit Cards
While it is true that most major online stores accept PayPal, they also accept a variety of other payment processors like Google Wallet, Apple Pay, and the traditional route of paying by credit/debit card. For this reason, most e-commerce platforms make it possible to accept credit cards as a standard feature. As more people join the ranks of online shopping you're likely to encounter customers who've hardly even heard of PayPal. Imagine being in line in a physical store that only accepts PayPal yet doesn't give any warning – how many customers do you think they'd have to turn away at the register?
Not Everybody Has a PayPal Account
According to PayPal's own statistics, there are about 188 million active PayPal accounts worldwide as of the second quarter of 2016. That seems like a lot, but let's put it into perspective. There are about 318 million people in the U.S. alone, and according to Mintel's Online Shopping US 2015 report, about 70 percent of that population (about 222 million people) shops online at least once a month. Thus, even if PayPal were a U.S.-only company (and it's certainly not), there would still be 34 million online shoppers without a PayPal account. Scale that up to a global level and it's not unreasonable to estimate that there are hundreds of millions of online shoppers who don't even have PayPal accounts. That's not too surprising, since most sites accept debit and credit cards.
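For the curious, the arithmetic above can be checked in a couple of lines, using only the article's own round figures:

```python
# Back-of-the-envelope check of the figures quoted above (all in millions).
# The inputs are the article's own numbers, not fresh data.
paypal_accounts = 188                                # active accounts, Q2 2016
us_population = 318                                  # approximate U.S. population
monthly_online_shoppers = int(us_population * 0.70)  # ~70% shop online monthly

# Even if every PayPal account belonged to a U.S. shopper, this many U.S.
# monthly online shoppers would still lack one:
shoppers_without_paypal = monthly_online_shoppers - paypal_accounts
print(monthly_online_shoppers, shoppers_without_paypal)  # 222 34
```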
Some People Strongly Dislike PayPal
Not only are there plenty of people who don't have a PayPal account, there are also some who vehemently despise PayPal and would be strongly against shopping at any online store that only accepts their least favorite form of payment. Out of the millions of people who use PayPal, tens of thousands have had bad experiences; while that's still a decent rate of customer satisfaction, it's also a pretty big chunk of e-commerce that you don't want to turn away by default.
Making it Easy for Customers to Give You Money
In closing, why should you make it difficult for people to buy things from your company? In fact, that's the silliest thing a growing business can do. Here you have a customer trying to make a purchase and you're unable to profit from it because you were either too lazy or too committed to one payment method to compromise. Overall, the goal should be to ensure that every customer who makes it to the checkout screen is able to get their money into your account one way or another.
One of the latest and biggest innovations in the manufacturing sector has everything to do with a technology called 3D printing. Originally developed at the Massachusetts Institute of Technology in the early 1990s, 3D printing is derived from an earlier form of additive manufacturing called rapid prototyping. It basically works by melting different types of plastic filament to build up solid objects in several steps. Any object fit for 3D printing first starts out as a CAD (Computer-Aided Design) file created in a 3D modeling program such as TurboCAD or Trimble. The software slices the digital design into many thin layers and sends them to the 3D printer, which deposits them one on top of another to build the physical object.
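As a rough illustration of the slicing step described above, here is a minimal sketch that just computes the z-height of each printed layer from a model's height and a layer thickness. This is not how any real slicer is implemented – real slicers intersect the 3D mesh with each plane and generate a toolpath per layer – but it shows the layering idea:

```python
# Illustrative sketch only: computes the z-plane of every layer a printer
# would deposit, given the model height and a layer thickness (both in mm).
def layer_heights(model_height_mm, layer_mm=0.2):
    """Z-heights at which the model is sliced, from bottom to top."""
    count = max(1, round(model_height_mm / layer_mm))
    return [round(i * layer_mm, 6) for i in range(1, count + 1)]
```

For example, a 1 mm tall model at a 0.2 mm layer height yields five slicing planes, which is why finer layer heights translate directly into longer print times.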
The great thing about 3D printing lies in the many uses and applications that this still-emerging technology will soon be able to fill. Even at this early stage, however, 3D printing is responsible for a great many wonderful innovations. Here are just some of the ideas that have been brought to life by the magic of 3D printing:
1. Automotive industry
The mainstream automotive industry has so far taken a liking to the idea of 3D printing, especially when it comes to prototyping parts for their vehicles. General Motors, for example, has been using the technology to generate parts that can improve their 2014 Chevrolet Malibu. Ford has also gotten in on the game, with 3D-printed prototypes of many parts in their vehicles, including brake rotors, shift knobs and cylinder heads. But it's small independent companies like Kor Ecologic that have really made strides in this department, including designing and assembling an entire functional vehicle called the Urbee 2.
2. Consumer electronics
With the advent of 3D printing technology, the future of consumer electronics may very well lie in customization and self-innovation. Already, PolyJet technology can produce models with very thin walls of 0.6mm or less, which are ideal for designing small devices with densely packed miniature components. Ever since 2013, when the first 3D-printed consumer electronic device was created in the form of a fully functional loudspeaker, the technology has advanced rapidly, and it now seems like only a matter of time before 3D-printed smartphones become a reality.
3. Toys and models
One of the first things that 3D printing has proven useful for is to create real-life models of fictional characters. Popular with children and adult enthusiasts alike, such models can already be designed and ordered online, but industry giant Mattel has already taken things to the next level with its ThingMaker. This 3D printer works with both an iOS and an Android app, letting its customers design new figurines starting from a few basic blueprints that can then be extensively customized according to the user's imagination. You can also use this technology to create 3D models of 2D photos, thereby bringing to life an old memory or an unforgettable piece of imagery.
4. Airplane parts
One of the more spectacular applications of 3D printing technology is happening right now in the aerospace industry. None other than NASA itself has taken to designing a 3D-printed rocket engine injector that generates 10 times more thrust while still being stable enough to pass a hot fire test. And, with the first 3D printer already serving in space as a mini-factory for the International Space Station, it's safe to say that 3D printing's future is in good hands.
5. Special effects
Finally, another area in which 3D printing has already begun having an effect on our lives is the movies. Rather than building models by hand, filmmakers now turn to 3D printing to handle the massive workload of designing various costumes and outfits. Top-notch studios such as Legacy Effects have employed this technique to great effect in Hollywood blockbusters like Iron Man 2, prompting a renaissance for practical special effects that rely less on CGI and more on technology-aided craftsmanship.
All in all, there's no doubt that the impact of 3D printing has only begun to be felt throughout the world. As the technology continues to develop and expand in popularity, its still-high costs will continue to decrease, gradually turning 3D printing into a technology that anybody can use. These are the makings of a true revolution that will ultimately have an incredible impact on life as we know it.
Regardless of your company's size or the industry you operate in, you can't get around the fact that a robust communication and collaboration platform is needed. Many businesses use project management interfaces to complement basic messaging and calling apps (Skype obviously being the most popular). However, an alternative called Slack has managed to triple its user base within the past year, reaching an impressive 2.7 million daily active users (DAU) practically at the speed of light. Surely there must be some tangible advantages that are sparking such a rapid growth rate? If you're wondering how Slack is better than Skype, or vice versa, here's a brief comparison of the two:
Cost and Functionality
In terms of cost, Skype is hard to beat, as it is free to use as a messaging app and only charges for voice calling credits. Slack also has a free version, but it has some feature limitations, whereas the free version of Skype is fully functional apart from the paid calling features. With that said, Slack can do a lot more than Skype altogether, which is probably why about 800,000 of its users have opted for the paid version. In terms of simplicity and learning curve, most people are more familiar with Skype's interface, but reading a simple guide on how to use Slack should bring anyone up to speed quickly.
Compatibility and Integration
Slack wins in this department hands down, with the ability to allow other software to post messages to its interface. You can even create custom integrations to link the software with any program you'd like. In terms of bringing everything together under one roof, Slack takes the cake, as it even integrates with Skype and Google Hangouts for voice calling functionality. Of course, this year Slack added built-in voice chat and video calls, so you no longer need Skype integration to make voice calls.
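As a sketch of what a custom integration can look like, Slack's incoming webhooks accept a simple JSON POST. The webhook URL below is a placeholder – you generate a real one in your workspace's integration settings – and the optional channel override is honored only by some webhook configurations:

```python
import json
import urllib.request

# Placeholder URL: replace with a real incoming-webhook URL from your workspace.
WEBHOOK_URL = "https://hooks.slack.com/services/T00000000/B00000000/XXXXXXXXXXXXXXXX"

def build_payload(text, channel=None):
    """Build the JSON body Slack's incoming webhooks expect."""
    payload = {"text": text}
    if channel:
        payload["channel"] = channel  # optional override of the default channel
    return payload

def post_to_slack(text):
    """POST a message to the webhook; requires a real WEBHOOK_URL and network."""
    body = json.dumps(build_payload(text)).encode("utf-8")
    req = urllib.request.Request(WEBHOOK_URL, data=body,
                                 headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return resp.status

if __name__ == "__main__":
    post_to_slack("Build finished: all tests passed")
```

The same pattern is what lets build servers, monitoring tools, and custom scripts post directly into a Slack channel.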
Interface and Features
There's no question that Slack has a larger feature set than Skype – another reason why it is now being compared to such an established app. The bottom line is, you can do more with Slack than you can with Skype, as the latter is purely a chat/calling app, whereas Slack is a complete communication, storage, and collaboration solution that integrates with all of your major services like Google Drive, Dropbox, GitHub, Trello and more.
Ultimately, it all depends on what you're looking for and what your business needs to operate efficiently. If you're just looking for a basic messaging and calling app and have no need for additional functionality beyond that, you can always start with Skype as a preliminary solution for your business, and then, as your needs expand, try a more inclusive, overarching approach with Slack. Fortunately, making the transition from Skype to Slack is typically easy thanks to streamlined integration that basically lets you use Skype features within Slack. Which one do you prefer?
There is no doubt that technology has changed the way we do most of our day-to-day tasks. Learning has not been spared, and in an effort to elevate the learning experience to even greater heights we have seen many new gadgets and software platforms that boast cutting-edge technologies and advanced tools. The outcome is that we now have many eLearning platforms that people find genuinely interesting. Streamlining learning goals and objectives has become easier, as it is now very simple to obtain course learning blocks that we could only imagine a decade ago. Considering that technology innovation never seems to take a break, 2016 has presented us with many eLearning technologies, and we find the following 5 picks to be really unique.
Cloud Platforms for Easy Access and Storage of ELearning Materials
Cloud platforms provide a perfect environment for any eLearning experience. Nowadays many people prefer to move most of their digital data into the cloud, and usually for good reason. Cloud-based eLearning tools are highly favoured because they enable remote access to the data of interest. Besides this, it is possible to form highly effective collaborations with other members sharing the same cloud-based eLearning platform, which adds to the general learning experience. Another advantage is that many organizations now prefer to store most of their digital content in cloud-based tools, and the benefit of this move is massive cost savings and a productivity boost when it comes to online training and learning.
Wearable Tech Gadgets
Wearable technologies have found most of their use in areas such as health and fitness monitoring and gaming. eLearning is another area that can greatly benefit from the unique advantages wearable technologies offer. For instance, wearable gadgets like smartwatches allow quick online access to eLearning resources such as training modules, interactive modules, or online scenarios, as dictated by the eLearning platform being used. Wearable technologies also make it possible for eLearning resources to travel wherever a person goes. This way an individual can gain relevant skills and training without being confined to a particular location. It is also possible to narrow an eLearning experience down to a specific geographical location, which ensures that those enrolled in a particular eLearning resource receive materials that are culturally and socially appropriate for that location.
Virtual Reality Headsets and Glasses
Virtual reality has been a hot topic of discussion, especially in the gaming industry. Many companies have given out test trials of virtual reality gadgets, and the immersion concept these gadgets operate on has been likened to something out of a sci-fi movie. The concept has since evolved, and many people see the opportunity to use it for applications like creating complex engineering designs or simulating real-life experiences that would otherwise be dangerous to test directly. The VR concept can make eLearning even more interesting, as teaching presentations made with it tend to make learners more appreciative of what is being delivered.
Automated Development Platforms
ELearning requires significant investment in terms of both time and money, especially if the learning objective includes some element of interaction. In 2016 we have seen many automated development platforms emerge, and most of these offer pre-simulated templates, graphics, and interactions that correspond to the learning topics of interest. This helps cut down development time and creates an environment where any interaction process required for eLearning is automated.
For best results, any learning experience has to be reinforced with elements like discussion. This is one area where training telepresence has been of great help; this technology works by providing more of a social gathering for members who share a similar eLearning platform. Learners from any part of the world can engage in online discussions and collaborate with one another without any geographical limitation. Training telepresence makes use of high-definition cameras, audio materials, and simulated space, which makes learners feel like they are actually sharing a common physical space. Some researchers have suggested that eLearning can get even better if training telepresence is combined with elements of virtual reality. This would help create a highly immersive environment that can make online discussions worthwhile and thus add more to the overall objective of any eLearning.
Learning Management System (LMS)
An LMS offers a unique advantage in that it is able to meet the eLearning needs of many government and enterprise organisations. A learning management system like LearnFlex is an excellent platform when it comes to flexibility, adaptability, scalability and overall effectiveness.
Not many years ago, information technology programs across Australia showed booming registration numbers. Those numbers have been declining in recent years according to the findings in a report sponsored by the Australian Bureau of Statistics. At the same time, the number of people who are enrolling in health care related courses is steadily on the rise. Why are fewer people choosing to study information technology in Australia?
Several factors may be behind the decline in IT enrollments. Foremost among them is the perception that technology simply evolves too quickly for university programs to be able to keep up. The information found in textbooks becomes outdated as soon as those books go to print. Moreover, there may be a prevailing impression that instructors who are not actively working in the IT field cannot possibly have a grasp on the latest developments. Many prospective students may believe that they can do better by obtaining on-the-job experience rather than delaying employment in favor of more education.
Declining student enrollment numbers in information technology courses may also be attributed to the impression that there are not enough jobs available in this sector. That's largely because students think that the majority of IT-related jobs are being farmed out elsewhere. Studies show that many IT jobs have been sent to China, India or elsewhere. Labor is far cheaper in these nations, making this an attractive option for many organisations. They may also experience a reduction in operating costs and an increase in productivity.
While it's true that a growing number of computer science jobs have gone overseas, demand for information technology professionals remains steady. That's because there are down sides to outsourcing information technology jobs. The most frequently cited problems are related to security and a loss of control over information and operations. Many companies have also discovered that the quality of work produced is frequently inferior when compared to work by local employees. This means that Australian companies are likely to continue to look for local graduates to fill open positions.
Experts also suggest that while young people are enthusiastic about technology, they are not generally interested in pursuing it as a career. In short, they love what technology enables them to do, but that doesn't inspire them to make information technology their career path.
One of the reasons why students may be staying away from computer science and other related courses is the impression that it is an exceptionally difficult field of study. Many students who take an introductory programming course are put off by the rigid syntax and the unfamiliarity of the structure. They labor intensively to bring about even the simplest of results. The path is easier for students who have already been exposed to programming languages in primary or secondary education. Unfortunately, there are not a great many options available for this subject matter at these levels. When students encounter such difficulty with learning their first programming language, there is a tendency for them to drop out of technology majors in favor of something entirely different.
The U.S. is experiencing a similar decline in information technology program enrollment. Students there seem to cite many of the same reasons for shunning IT degree programs that are given in Australia. However, some argue that the decline and resurgence of enrollment in computer science courses is naturally cyclical. There was a peak in computer science students in 1985, but those numbers declined through the 1990s, increased in the early 2000s and then fell away again. If the cycle holds, more companies may also decide against outsourcing their IT jobs, bringing them back to the U.S., Australia and elsewhere.
If this trend remains true to form, then IT program enrollments may be on an upswing. Perhaps it will be health care education programs that begin to decline in the coming years while information technology begins to see a resurgence. To be ready for it, it seems like an excellent plan to get more computer science education into the primary and secondary education levels. With earlier introduction of basic concepts, new tertiary students will be better equipped to deal with the challenges of computer programming and other technology-related subjects.
Ten years ago, most of us would not have even been able to imagine the existence of 3D printing, much less all of the practical applications of this amazing technology. Some of the exciting new ways in which 3D printing is revolutionizing the world we live in include new applications in medicine, industry, and even the way we use water.
This video demonstrates the potential of 3D printing to improve thousands of lives by producing high-quality prosthetics for a fraction of the usual cost. One California company, Not Impossible Labs, has taken this technology to war-torn Sudan to help alleviate the suffering of amputees. After training locals to operate the machinery, the team created and fitted customized prostheses, helping those without resources regain mobility. In addition to prostheses, researchers are also using 3D printing to develop potentially life-saving implants such as heart valves.
Surgeons have used 3D printing to create substances that replace human bone, and have even successfully reconstructed a severely damaged skull. The possibilities aren't limited to our physical bodies, though. Chemist Lee Cronin believes that one day it will be possible for people to purchase chemical blueprints and ink and print their own medications at home!
According to one article, two thirds of all top manufacturers use 3D printing somewhere in their processes. The majority use it to create prototypes of new products because it is faster and less costly. However, 10% of manufacturers have found ways to incorporate it into the actual production process, and 3% reported that their products couldn't be made without 3D printing technology. Based on the current growth rate, the market is expected to grow from $2.5 billion in 2013 to $15.2 billion by 2018.
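Those two figures imply a compound annual growth rate of roughly 43% per year. A quick back-of-the-envelope check (the dollar figures come from the article; the calculation is just the standard CAGR formula):

```python
# Market size grows from $2.5B (2013) to $15.2B (2018): 5 years.
start, end, years = 2.5, 15.2, 5

# Compound annual growth rate: (end / start) ** (1 / years) - 1
cagr = (end / start) ** (1 / years) - 1
print(f"{cagr:.1%}")  # roughly 43.5% per year
```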
Another article points out that 3D printers can use up to ten different materials simultaneously. The printer can scan the geometries of all the necessary components of a complex item and use that information to print other objects around them. Rather than shopping for the right size case to fit your expensive tablet, it's now possible to have a case printed directly onto it.
All modern water systems utilize valves, and researchers are working on using 3D technology to create new types of valves. Traditionally, precision valves that regulate the flow of not just water, but oil and other liquid substances, have been made through a careful process of first creating a pattern, or "cast" of wood or plastic. 3D printing allows unique valve designs to be created and cast more quickly and inexpensively. Surprisingly, it has also paved the way for the development of temperature sensitive smart valves.
A scientific paper outlines the details of a new ink that can print thermally actuating hydrogels, creating a smart valve from a network of alginate and poly(N-isopropylacrylamide). Thermally actuating means that the material interacts with its environment and responds differently to different temperatures. Experiments have shown that the gels increased in length by over 40% when exposed to heat and then cooled. Using this behavior, the researchers developed a smart valve that reduces the flow of water by 99% when exposed to heat and increases it again with exposure to cold.
Experts predict that 3D printing will make it possible to create fully functioning human organs within the next five years. This is wonderful news for the thousands of people on waiting lists for transplants and their loved ones. They also predict that 3D printers will one day become as popular as home computers, which could result in the same degree of rapid innovation as people transform their ideas into physical realities.
Since powerful new technology creates the potential for abuse of that power, experts also point out the necessity for regulation of the industry. For example, 3D printed firearms may one day be used to commit crimes. Other legal considerations include the effects of 3D printing on current copyright and intellectual property laws. The real challenge lies in achieving a balance between public safety and the rapid innovation that has produced inventions that, ten years ago, would have been considered miraculous. One thing is certain—3D printing will make the future more interesting.
So you have a new system and are now faced with a choice between dynamic disk storage and basic disk storage. As you may know, when you set up storage in Windows you can configure a disk as either basic or dynamic, and each type has its advantages and disadvantages. The decision can be a daunting one, especially if you're not particularly tech savvy, and it can be confusing even if you do have prior experience with computers. Never fear: this overview covers most of what you need to know in order to make your decision. Let's go over the advantages and disadvantages of both types to give you a feel for what each has to offer.
The first is the basic disk. As the name suggests, it's the simpler of the two, and it was the standard in earlier versions of Windows. On a basic disk, storage is divided into partitions, and the disk uses one of two partition styles: MBR (Master Boot Record) or GPT (GUID Partition Table). An MBR disk can contain primary partitions, extended partitions, and logical drives.
On a GPT disk, the divisions are called GPT partitions, and these function just like primary partitions. Note here that Windows XP does not support multidisk (dynamic) storage, although Windows 2000 does. A basic disk can also be converted to a dynamic disk later, which makes it a reasonable starting point.
However, the basic disk is fairly limited; if you're not going to be doing much with your storage, it may be all you need.
Now let's move on to the dynamic disk. The clue is in the name: it's more dynamic and more versatile. The first difference is that a dynamic disk is divided into volumes rather than partitions. Dynamic disks were created because technology is ever-changing, and more flexible storage types were needed.
As stated before, everything is divided into volumes, and a dynamic disk lets you manage your disks and volumes without having to shut down Windows.
You can create five different volume types on a dynamic disk: simple, mirrored, striped, spanned, and RAID-5.
Each of these has its own strengths. A simple volume functions just like a primary partition on a basic disk. A mirrored volume keeps an identical copy of your data on a second disk, protecting you against a single disk failure. A striped volume spreads data across two or more disks to improve input/output performance. A spanned volume combines free space from two or more disks into one larger volume. Lastly, a RAID-5 volume stripes data, together with parity information, across three or more disks, combining extra capacity with fault tolerance. All of these can enhance the performance or resilience of your computer.
Hopefully you have gained more insight into each of these disk types and are better able to decide which is the best choice for you. Take your time: you don't want to make a mistake and end up having to change everything after you've committed to one.
Although the technology behind anti-virus software continues to improve in leaps and bounds, the threats against computers and the data they contain still remain. With that said, keeping your antivirus software regularly updated is a critical aspect of your online security. Always keep in mind that hackers don't stick to the same threat tactics; they are constantly looking for ways to bypass and counter the protective programs that people install on their computers. Whatever operating system you use, it's wise to have more than one layer of protection, for example a real-time anti-virus program plus a separate on-demand scanner. The logic is simple: if the first line of defense doesn't catch and contain a threat, the second layer may. In fact, a lot of people use several layers of security to protect their computers and data from malicious attacks.
In trying to find the best protection for your computer, there are several factors to consider. For instance, what kind of data do you store on your computer, and how much of it is there? Anti-virus programs can only do so much in confronting threats, and viruses use different types of attacks to sabotage different types of data. That said, the software you choose should be capable of protecting whatever type of data you store on your computer. Fortunately, there's no shortage of software companies that focus on developing security programs, and these antivirus programs are constantly updated to ensure that they can hinder new threats.
Here are some practical tips on how you can efficiently prevent malicious code from wreaking havoc on your computer data.
1) Choose reliable anti-virus programs. One of the better-known security packages on the market today is the ESET NOD32 Anti-virus package. This particular software is known for its comprehensive features and ease of use, and it has performed well in independent tests at hindering malware such as worms, viruses, Trojans, spyware, and even rootkits. Navigating the program and its array of features is also a breeze; beginners have nothing to worry about because installing it and keeping it updated is just a matter of clicking a few buttons. What makes this type of software effective is that it scans files and data as they are opened or executed.
2) Enforce strict policies when it comes to downloading and uploading files. This is especially important if you oversee a computer network in which any employee can download and upload files. Keep the policies clear and make sure that every employee downloads, uploads, or executes only files that have been verified as clean and valid. The general rule is that everyone should assume any file the organization receives could be infected until it has been scanned.
3) Disable auto-run programs and drives on your computer. One of the easiest ways for viruses to enter a computer is by attaching themselves to a removable drive and then installing themselves automatically. If the auto-run feature is disabled, it is much harder for viruses to wreak havoc on your computer.
4) Block suspicious files sent to your organization via email. Hackers often use email gateways as virus entry points because people unthinkingly click on links contained in the messages they receive. Many antivirus programs have features that help stop these types of malicious messages, but some can still reach an inbox with nothing more than a warning attached by the security program. That said, you should educate your employees or staff to identify messages that may contain viruses and worms.
5) Always back up your computer data. Anti-virus programs can't guarantee that all threats are stopped and blocked, so it's very important to keep backups of your data on external drives. Don't leave these external drives permanently connected to your main computer network, as there's a chance a virus could spread to them without you knowing it.
Installing an antivirus program on your computer won't take much of your time, so there's no reason not to do it.
Between keeping track of employees' work schedules, monitoring expenses, and handling client complaints, running a business entails hard work. Good software can help a business increase its organisation and productivity, and in today's digitally-powered world, business owners are increasingly adopting such tools to help their businesses function and grow steadily. Here are the top 5 business tools to help you become more organised.
1. Google Drive
Google Drive has made business owners' lives easier. It enables them to access their business folders and files from virtually anywhere, and to share selected files or folders with other business owners (their contacts) at the click of a button. Google Drive also allows business owners to access all their business Google documents from a mobile phone, and it offers up to 15 GB of storage for free. Google Drive is especially useful for individuals who travel and for businesses with offices in different locations, because it makes sharing files simple.
2. LockedOn
LockedOn is an all-in-one management system for real estate businesses. It will help you manage your tasks and appointments, set up goals, and accomplish them. You can also use it to handle your SMS, MMS and email, manage communication with multiple clients in just a few clicks, and send bulk SMS messages to all of them. The software can also show you which of the properties listed on your site received the most clicks, so you can see what your clients want, insight that will help you grow as a business.
3. Expensify
It is a big headache for businesses to keep track of their managers' or workers' spending while they are on business trips, but Expensify makes the whole process less painful. Businesses can link a debit or credit card to their Expensify account so that all charges are placed directly on an expense report. Where that isn't possible, employees can simply take pictures of receipts with their phones and Expensify will extract the relevant information automatically. This makes producing expense reports faster and easier. There is also a phone app, and the cost is between five US dollars ($5) and ten US dollars ($10) per active account for team and corporate users. Expensify works on Android, iPhone, Windows Phone, and BlackBerry.
4. Square
Square is a payment app. It uses a small, portable credit and debit card reader that helps businesses complete transactions conveniently and quickly, making it a good fit for businesses with limited space, such as food trucks. For every swipe, the business is charged 2.75%. The fee is docked automatically from the purchase and the balance reflected in the bank account the next day; if the business sells a burrito for ten US dollars ($10), a net gain of about $9.73 will appear in the bank account. Bigger businesses with annual revenue above $250,000 can contact Square for custom pricing. Square works on all major operating systems and devices.
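The per-swipe arithmetic is simple. A quick sketch using Square's 2.75% rate (the burrito price is just the example above):

```python
def net_after_swipe(price: float, fee_rate: float = 0.0275) -> float:
    """Amount deposited in the bank after the card processor's cut."""
    return price - price * fee_rate

# A $10.00 burrito: the 2.75% fee is about $0.28,
# leaving roughly $9.73 in the bank the next day.
burrito_net = net_after_swipe(10.00)
```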
5. Evernote
Evernote is software for remembering everything. It puts all your tasks, to-do lists and notes conveniently in one place, and remarkably it is not restricted to computers: there is also a phone app. The software enables businesses to organise and store recordings, notes, and even pictures. Evernote suits office businesses and creative teams who have many tasks and ideas to keep organised for efficient use.
The world is digitally-powered today, and this is the right time for all businesses to move to the next level. The tools above are some of the software that can help any business grow and run smoothly, making work easier by handling expense reports, emails, notes and more.
The cloud is transforming the world of business, and if your business isn’t yet on board, you’re running late for the revolution. The cloud harnesses the potential of always-on connectivity and lightning-quick responsiveness, and it has created a space in which a new industry of cloud services are thriving.
Companies the world over are leveraging the advantages that these new services have to offer, and they’re seeing returns in virtually every aspect of their operations. If you haven’t yet made a move to the cloud, here’s a look at what you’re missing.
Cloud services take advantage of the principles involved in economies of scale. Economies of scale are, put simply, the reduced costs that come with spreading the fixed costs of an enterprise over a larger base: as a cloud service increases its client base, the cost per client decreases.
Thus, instead of investing in a network infrastructure capable of handling your company's computing and storage needs, cloud services allow you to subscribe to a demand-based system that allows you to pay only for what you use. Whether it be additional features or added storage capacity, the monthly price scales up or down based on demand.
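As a toy illustration of demand-based pricing (the rates and line items below are invented, not any real provider's price list), the monthly bill is simply a function of what you consume:

```python
def monthly_bill(storage_gb: float, users: int,
                 per_gb: float = 0.02, per_user: float = 5.00) -> float:
    """Pay-as-you-go sketch: cost scales with usage instead of
    requiring a fixed up-front infrastructure investment."""
    return storage_gb * per_gb + users * per_user

# A small team's month: 500 GB stored, 10 active users.
bill = monthly_bill(storage_gb=500, users=10)
```

Drop to five users next month and the bill shrinks accordingly; there is no idle hardware to write off.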
Cloud services also take on the responsibilities – and the associated costs – of system and software updates and upgrades. Not only does this shift the duties away from your in-house IT department, but it also dramatically speeds up the rate at which they’re deployed.
The same principles that reduce the costs of cloud services also enhance the quality of the services that they can provide. Cloud services are typically capable of offering a level of security far superior to that which any one of its clients could achieve within the confines of its own staff and budget.
Major cloud-based service providers utilize hardened data centres to ensure the protection of their clients' data. These facilities employ state-of-the-art firewalls, leading-edge encryption techniques, and even armed guards.
This protection extends to the integrity of the data as well. For example, Praktika provides automated backups that eliminate the need to back up and store your data locally, further reducing the hardware costs and payroll hours associated with these vital yet time-consuming tasks.
Cloud services are based on a software delivery model called Software as a Service – also known as SaaS. In this model, the software is maintained in a single location and accessed remotely using a standard web browser. As they utilize interfaces similar to any typical web application, employees typically require very little training to become proficient in their use.
This ease of use has significant advantages when it comes to their adoption and deployment – processes that once took months to complete and were often followed by intense periods of training and troubleshooting – but that’s only the beginning. Productivity is noticeably improved by systems such as these.
It’s not hard to see why – employees are able to access the service from any location with Internet access. Whether they’re at home or on the road, team members can communicate and collaborate in real time in the same virtual space. The boardroom has gone digital, and the conference table is now nothing more than a tablet.
The Cloud is Rising
The era of cloud computing has only just begun, but businesses are already clamouring to gain the competitive advantages that it offers. Indeed, there are burgeoning businesses that are basing entire business models upon the availability of cloud services.
This, of course, should be a clear warning to any company that hasn’t yet begun to consider the advantages of the cloud. Its benefits can be leveraged for you, but they can also be used against you.
Indeed, the biggest of companies are taking even bigger steps into the world of cloud computing. Entire infrastructures are being designed and deployed to serve as private clouds, complete with business-centric software solutions and in-house development teams.
The days of inflated licensing fees, bug-ridden software and long-delayed patches are over. Efficiency and agility are the name of the game in the world of cloud computing, and the cost reductions are simply too significant to overlook.
With expenditures on cloud computing expected to clock in at over $106 billion in 2016, competition between cloud services will only continue to improve their costs and capabilities. There's no better time than now to take a step into the cloud.
While many people have started ditching landlines to switch to cellular phones, many houses and most businesses still use regular landline phones. There are several good reasons to have them around -- they tend to be more reliable, often offer better voice quality, and for emergency services a landline provides your location immediately and reliably.
For most people this means having a standalone phone around, which is usually cordless. Why not? It's more convenient than a corded phone that ties you down.
However, cordless phones use radio waves to transmit the signal, usually in the same 2.4 GHz band as Wi-Fi. This "over the air" connection to the base station means that, as with Wi-Fi, someone sitting outside your home or office could listen in on what you're saying.
Wi-Fi allows you to encrypt your signal so even if someone intercepts your signal they won't be able to understand it. Can you do the same with your cordless phone?
Yes and no. Cordless phones can be encrypted so most people will find it very hard to snoop on your conversation. However, you can't add it to your phone if it doesn't have this feature already -- you'll have to buy a new phone.
The reason is that older cordless phones transmit information between handset and base station using an analog signal. This means that any snooper with a radio receiver tuned to the right frequency who can get close enough to your property to receive the signal can listen in.
This is why cordless phones shifted to digital signals and "Digital Spread Spectrum" technology, which hops between frequencies rapidly to make the signal hard to intercept. (Frequency hopping, the basis of Digital Spread Spectrum, or DSS, was co-invented by the actress Hedy Lamarr during World War II.)
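The core idea can be sketched in a few lines (a toy model, not the actual DSS algorithm): handset and base station share a secret seed, derive the same pseudo-random channel sequence, and hop in lockstep, while an eavesdropper without the seed sees an apparently random pattern.

```python
import random

def hop_sequence(seed: int, hops: int, channels: int = 60) -> list:
    """Derive a pseudo-random channel-hopping schedule from a shared seed."""
    rng = random.Random(seed)
    return [rng.randrange(channels) for _ in range(hops)]

# Both radios are provisioned with the same seed, so they stay in sync.
handset = hop_sequence(seed=0xC0FFEE, hops=8)
base_station = hop_sequence(seed=0xC0FFEE, hops=8)
```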
The newest and most secure phones are called DECT phones because they follow the Digital Enhanced Cordless Telecommunications standard, an advanced form of this digital approach that adds encryption and transmits at 1.9 GHz instead of 2.4 GHz, so the phone won't interfere with Wi-Fi or other cordless phones. They're often labeled as "Wi-Fi Friendly" for this reason.
If you're wondering what kind of phone yours is, check your manual. If there's no DECT or DSS mentioned anywhere, yours is an analog phone.
Having a DSS or DECT phone will make it much harder for someone to snoop on you... but of course "harder" is far from "impossible." Hackers have been able to crack DECT encryption for some time, but it requires advanced technical knowledge, high-end radio equipment, and specialized software -- most people aren't going to go to that kind of trouble.
Normally, if someone is willing to spend that much effort to eavesdrop on you, it's because you're a highly important target -- a key person in a big company or a high-ranking government official. In that case, you'll probably be using other security measures.
If you have a DECT phone, in other words, you can be fairly confident none of your neighbors are eavesdropping. If you're still worried, you can just use a corded phone -- but keep in mind it's always possible to add a phone tap directly on your phone line.
As a young developer on the prowl, or just a casual enthusiast, you've more than likely stumbled upon the term ASP.NET. Now a fabled framework used to produce dynamic web pages, ASP.NET first arose in early 2002. In the decade and more since, it's been the go-to framework for anyone interested in web development. Everything has been said about it, from good to bad, making it the technological version of the hot girl at the far end of the bar. As someone who dated that girl for a while, I can tell you the pitfalls and highlights, the myths and truths about it.
.NET is like PHP
First things first: you cannot compare these two, and not just in a "this one's way better, there's no comparison" way. They're simply two different things. PHP is a programming language, while .NET is an application framework, meaning it's an environment for building applications. It's a platform that runs on the CLR, and you program against it in languages such as C# or Visual Basic .NET. To compare the two, PHP and .NET, would be like comparing a gun to a bullet. What you can compare, however, is PHP to a language that runs under .NET, such as C#.
Future of the internet and the best technology for creating a website
This is what the fanbase will always tell you, but a good ASP.NET developer will tell you that the truth is more nuanced. While it is the future platform for all (yes, ALL, you read that right) Microsoft technologies, its prime use is not solely the internet; in truth, it's far more likely to make its greatest appearances on corporate intranets. As for it being the best technology for building a website, well... that depends on how good you are. It's amazing for creating dynamic web pages in general, but it comes down to how you're used to doing things and how much money you're willing to spend. The most common question ASP.NET developers ask themselves is "How much will it cost to host this page?".
Taking it for a test ride
In the end, it's all about taking what you've learned and hopefully mastered in theory and doing something practical with it. Start simple: write code that makes use of a database, or create something that helps you design tables. You should also build a simple UI for visitors to see, such as an article page or anything your prospective visitors would find interesting. Make sure to have an admin dashboard to which only you have access, as security plays a major role here and in web development in general. This should all be an elemental part of your learning and growth.
Hopefully sharing all this knowledge has given you useful information about this magical tech beast. Now go and tame it, and eventually work effortlessly with it.
With an October 2015 deadline for US retailers and hospitality operators to migrate over to accepting EMV chip cards, current estimates are that at least $8.65 billion is being spent to prepare for the shift. But is that really the main reason customers are purchasing or upgrading a POS system? Some analysts say no -- with most of the world already operating with chip-and-pin cards, security and mobile payments are an even bigger factor. Especially for restaurants, being able to accept the latest payment options appears to be the most important benefit of a POS upgrade.
New Functionality is Hospitality Providers' Main Desire
While usability and reliability still appear to be the main factors hospitality operators use in making POS purchasing decisions, the same isn't true for POS upgrades. According to Hospitality Technology magazine, most hospitality operators looking to upgrade their systems are focused on being able to accept new payment options like mobile payments: 56% of restaurants cited "enabling new payment options" as the main consideration in their upgrade, 9% more than the next-most popular drivers (adding mobile POS functionality and preparing for the US EMV rollout). Both suppliers and restaurant operators agree that the ability to accept the latest mobile wallet payments is having a major impact on the market. At the same time, maximising security and preparing for EMV are almost as important (with 47% of restaurants citing EMV as a reason to upgrade and PCI security compliance being an issue for 45% of restaurants surveyed). Only about a quarter of restaurants were particularly concerned with integrating their POS with other systems.
Upgrades More Important than New Hardware
67% of the restaurants surveyed by Hospitality Technology said their goal at the moment is to upgrade their existing POS solution, rather than purchase a new one. Only 19% were planning to put in a POS solution from a new vendor, suggesting supplier relationships are fairly stable. Still, with 38% looking at new POS solutions which they might install after 2015, the POS industry could be seeing a change on the horizon.
Mobile Payments, Loyalty Tools, Tablet-based Software as the Most Popular Features
Among the features most restaurants are looking for, mobile wallet functionality tops the list -- 59% of restaurants are looking to add the feature in 2015. Close on its heels are loyalty tools and tablet-based software that employees can use as they walk around. Social media integration, on the other hand, comes in at just 33%, along with many other features such as centralised POS and inventory management. Overall, the picture suggests that restaurants are looking for flexible systems that will "work the way they do" in taking orders and processing payments. Still, security may be playing a larger, as yet under-recognised, role.
Security as an Increasingly Large Driver
According to SAIC CIO Bob Fecteau, as quoted in the Wall Street Journal, payment security may be one of the largest "hidden trends" in the POS market. In his view, 2015 may see "a whole new level of security" start to take shape. Why? Unless banks and businesses tackle the weaknesses in current POS technology that criminals so frequently exploit, the resulting financial impact is likely to be "significant." The crucial issue with new mobile payment services such as Apple Pay will be how to keep them secure and avoid losses.
The rising popularity of new payment technologies comes at a time when criminals are ramping up their assaults on POS systems and mobile devices, according to Verisign iDefense Security Intelligence Services. Their 2015 "Cyber Trend and Threat Analysis" number-one top prediction is increasing attacks on mobile and POS technology. Their researchers have observed attackers developing new software to attack mobile platforms and POS devices. Despite law enforcement agencies' best efforts, the US alone is estimated to lose around $8.6 billion in credit card fraud each year. At the same time, the EMV shift means that merchants who use POS systems which are not EMV compliant but who take EMV cards accept liability for any fraudulent transactions.
For many merchants outside the US this is not much of an issue, of course: industry estimates are that 70% of the POS terminals outside the US are EMV compliant, while 40% of the cards in worldwide circulation support EMV. The highest adoption rate is in Europe (with 96% of card-present transactions using EMV), followed by Canada, Latin America, and the Caribbean. The Asia-Pacific region (including Australia) has 71% of terminals supporting EMV, but just 17% of cards.
Currently, there are over 3.9 million jobs in America associated with cloud computing, and of these, 384,478 are in information technology alone! IT professionals armed with cloud computing experience take home a median salary of $90,950. Internationally, there are a staggering 18,239,258 cloud computing jobs, with China accounting for the largest share at 40.8% of the industry.
These and other important insights come from WANTED Analytics, a firm that specializes in data analytics on particular workplaces and industries. The company's database contains over 1 billion job listings and documents hiring trends from over 150 countries.
Most Wanted Computing Certifications:
To land these top jobs, it's important to have an idea of exactly which qualifications are required. The data suggests it is worth investing in certifications such as Project Management Professional (PMP), Top Secret/Sensitive Compartmented Information clearance, Cisco Certified Network Associate (CCNA) and Certified Information Systems Security Professional (CISSP). If you want to advance your qualifications and get your dream IT job in 2015, make sure to invest in one of the above certifications to give yourself a leading edge.
Number of IT Jobs:
It is important to have a clear understanding of the employment options within the industry. The analytics gathered show that the industry currently has 1,533,742 job openings globally! As previously noted, China leads the employment force with 40.8% of the jobs. The US is the second-highest employer in the field, with 21.7% of jobs located across the US. India comes in third, accounting for 12.2% of computing and IT jobs.
Organizations Occupying the Workforce:
The top three worldwide organisations leading the IT workforce are IBM, Oracle, and Amazon. Other companies renowned in the IT world include General Dynamics, Dell, Accenture, WellPoint Inc., J.P. Morgan Chase & Co., Computer Sciences Corporation, Deloitte, Wells Fargo and Lockheed Martin. The companies listed can be viewed as the trendsetters in the IT employment sector.
WANTED Analytics paints a promising picture for 2015, with predictions and analysis indicating a significant increase in demand for IT-related jobs. With this information at hand, qualifications are becoming increasingly important.
With the above information, you are now equipped to concentrate your efforts in the right direction and establish a prosperous and fruitful career in 2015.
Installing the IBM DB2 database server on Linux is straightforward, as it comes with a graphical installer image. The most important part of installing DB2 is preparing your system, which means installing its hardware and software dependencies.
In Linux, a headless server must have a graphical environment to be able to run the DB2 setup wizard, so the X window system and a basic window manager, such as OpenBox, must be installed. After meeting the basic requirements, all you have to do is launch the installation wizard from the CD or ISO image.
Install the X Window System and a Window Manager
If you already have a Linux desktop environment, such as Gnome, Unity or KDE, you can skip this step and proceed to running the installer. If your server is only set up to run Apache from the command line, you must install Xorg and any other related packages from your package manager. Whether you use Yum, Apt or Pacman, simply enter the appropriate command to install X:
yum install xorg xorg-server xorg-utils
apt-get install xorg xorg-server xorg-utils
pacman -S xorg xorg-server xorg-utils
Refer to your distribution's official repositories for the exact package names required to run X on your system. You also need to install a video driver; it only needs to be a simple, lightweight, open-source driver if you're only going to use it to install DB2. If you have a Debian or Ubuntu server, you can install all the required packages, including Xorg and a video driver, by running the following command:
apt-get install lxde
This meta-package installs the essential packages needed to log into a graphical session and run the DB2 installer, and it should only take up 50MB to 60MB of hard-disk space.
Run the DB2 Installation Wizard
After rebooting your computer and logging into a graphical session, insert the DB2 installation disk and mount it in your user's media directory. For example, enter the following command at the Terminal prompt:
mount /dev/sr0 /run/media/username/DB2_INSTALLER
Alternatively, just open a file manager, such as Nautilus or PCManFM, and select the disc in the navigation sidebar. In a Terminal window, enter the following commands to unpack and run the installer:
gzip -d db2setup.tar.gz
tar xvf db2setup.tar
The graphical installer opens, and you can install DB2 by selecting Install a Product and then choosing the products you want to install from the disc.
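If installing X on a headless server feels like overkill, the same media can also drive a silent, text-mode install from a response file. Here is a minimal sketch; the keywords mirror the sample response files shipped on the install media, but the product keyword and install path are assumptions for a DB2 10.5 server edition, so check the samples bundled with your version:

```
* db2server.rsp -- minimal response file for a silent DB2 install
PROD            = ENTERPRISE_SERVER_EDITION
FILE            = /opt/ibm/db2/V10.5
LIC_AGREEMENT   = ACCEPT
INSTALL_TYPE    = TYPICAL
```

You would then run ./db2setup -r /path/to/db2server.rsp as root from the unpacked installer directory instead of launching the wizard.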
Back Up and Restore DB2 Database Files
Once you have DB2 installed on your computer, you can remove the graphical packages if you don't want to use them. DB2 runs entirely from the command line, and you can use a few simple commands to perform maintenance operations, such as backing up and restoring database files. To restrict usage to the system administrator, use the following command:
db2 quiesce db database-name immediate force connections
Substitute the name of your database for database-name in the command. To back up a database, use the following simple command:
db2 backup db database-name
The database is saved in your current directory. Later, you can restore the database with the following command:
db2 restore db database-name
This command automatically chooses the most recent backup image. If you would rather restore an earlier backup, include a time stamp (in the form yyyymmddhhmmss) with the restore command, as in the following example:
db2 restore db database-name taken at timestamp
Next, roll forward the database through its log files to a point in time, substituting an ISO time stamp for isotime in the following command:
db2 rollforward db database-name to isotime using local time and stop
After issuing these commands, your DB2 server is ready to use, with the restored database brought up to date from the log files.
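The quiesce/backup sequence above is easy to wrap in a small helper for nightly jobs. A sketch, assuming the db2 CLI is on your PATH; set DB2_CMD=echo to dry-run it and just print the commands without a DB2 install:

```shell
# Back up a database while keeping users out -- a sketch around the
# db2 commands shown above. DB2_CMD is swappable for dry runs.
backup_db() {
    dbname="$1"
    backup_dir="$2"
    cmd="${DB2_CMD:-db2}"
    # Block new connections so nothing writes during the backup.
    "$cmd" quiesce db "$dbname" immediate force connections
    # Write the backup image into backup_dir instead of the current directory.
    "$cmd" backup db "$dbname" to "$backup_dir"
    # Let users back in once the image is written.
    "$cmd" unquiesce db "$dbname"
}
```

For example, DB2_CMD=echo backup_db SAMPLE /var/backups/db2 prints the three db2 commands for review; drop the override to run them for real.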
At first glance, an ethernet cord or optic fibre cable makes little impression. The mix of colors on its outer sheathing does not begin to represent the complexity of the technology inside.
The sheathing comes in varied shades such as blue, red, yellow and mauve. But these shades are mere coatings; what is crucially important is what lies within them – the distinction between patch cords and cross-over cables.
Another noted variation is whether the cable is Cat5, Cat5e or Cat6. Aside from this, the cable's certification rating is equally important: the rating determines whether the cable is safe to run in an air duct, outdoors or under a carpet. To get to know this technology better, let us dig deeper into its components.
Some useful guides:
Patch Cables versus Cross-over Cables
Ethernet cord technology is very complex, especially on the inside, where the arrangement of the wires is essential. Some cables have wires that run parallel from one end to the other, while others are manufactured as cross-over cables.
Patch cables, also known as straight or parallel cables, are designed to connect computers to network devices such as routers and switches.
However, if you need to link two computers together directly, a patch cable will not do unless one of the computers has a network adapter with built-in cross-over support.
Cross-over cables, on the other hand, reverse the order of some of the wires at one end. If you use a patch cord to link two computers directly, both will try to transmit on the same wires.
The Variation of Cables: Cat5, Cat5e and Cat6
Cat5 can only transmit data at a rate of 100 Mbps, so it is not as fast as the others. If you take a look at the ethernet cables you bought a couple of years ago, though, they are probably Cat5 or Cat5e.
Cat5e can deliver gigabit ethernet at a rate of approximately 1 gigabit per second. It is important to note, however, that actual data rates depend on the available bandwidth.
Cat6 is the next generation of the previously mentioned cords. It offers 250 MHz of bandwidth, more than twice the 100 MHz of Cat5e, and its design ensures less noise at the connection points.
Because of its higher specifications, Cat6 is a bit more expensive than the others. The good news is that it can comfortably stream large amounts of data, such as HD video.
With optic fibre cables and cords, the speed of your Internet connection usually depends more on the capabilities of the router, switch and computers than on the cable itself.
Overall Quality of Cables and Prices
Most manufacturers use UL-certified cables. UL has been certifying communication cables since 1994 to ensure their safety and quality.
So when you are buying optic fibre or ethernet cables for your needs, it is practical and wise to look for those that are packaged with UL certification.
Under UL's provisions, there are six safety designations that categorize cables for varied uses; the higher the designation, the higher the price. For example, cables with the CM marking can be used inside buildings without the threat of fire-related trouble.
CMP-marked cables, on the other hand, are best for dropped ceilings and air ducts. There are also CMUC and CMX markings, which are rated for under-carpet and outdoor use respectively.
Know what you’re gettin'.
Cisco Systems (CS) is one of the biggest and most successful manufacturers of networking equipment. It has not reached such a high position in the electronics industry without suffering several controversies.
This company, with total assets of more than $100 billion, might be facing the biggest controversy of its existence, however, as news of "tapping" becomes widespread.
No Place to Hide
The author of the book "No Place to Hide," Glenn Greenwald, reveals the relationship between CS and Big Brother. Conspiracy theorists have long suspected that the US government is constantly watching them, and through Greenwald's book, this might well be true.
Greenwald's source is none other than Edward Snowden, a former contractor for the National Security Agency (NSA), who revealed that the NSA has been tampering with Cisco's products in a bid to keep track of specific people, or "targets."
In a newsletter released in 2010, it was revealed that products are pulled out of their original shipping route, brought to a secure location and fitted with beacon implants. The same products are then returned to their normal route and delivered to the intended targets, who are none the wiser about the tampering.
Snowden further identifies Tailored Access Operations (TAO) employees as the ones directly responsible for the placement of the beacons.
The allegations were released together with a photograph showing an NSA team installing the beacons in electronic devices bearing the Cisco logo. According to the book, the photo came from a newsletter sent to all NSA employees, which described the practice as a "routine process" carried out by the NSA's Access and Target Development Department.
Of course, not all CS products were fitted with these beacons; the NSA chooses specific people it wants to "monitor" and proceeds to bug their electronics for inside information.
Without Our Knowledge or Permission
The allegations caused a nationwide clamor, with the NSA placed under the public's magnifying glass. For CS, however, the problem may be bigger, as it threatens the future of the billion-dollar company.
To control the damage, a top executive from the company immediately published a response on its official website. Mark Chandler, Senior Vice President and General Counsel, categorically denied any involvement with the United States in these "beacon implants." He went further, saying that Cisco does not work with any government in this way – including the United States.
Another top executive of the company said that if any tampering was done, it happened without the company's knowledge or permission. Senior Manager of Corporate Communication Nigel Glennie further implied that the information given in the book was vague.
According to him, although the company's logo is clearly visible in the photo, there are no specifics about which products were tampered with, the techniques the NSA used, or the weaknesses of said products.
A Letter to Obama
What is interesting about this story is the fact that CS sent a letter to Obama asking for help with the situation. For some people, the mere fact that the company appealed to the President of the United States confirms its awareness of the tapping done by the NSA.
Sent by John Chambers, the company's CEO, the letter underlines the importance of trust between Cisco and its customers.
"Our customers trust us to be able to deliver…products that meet the highest standards of integrity and security."
Chambers points out that the controversy could undermine the position of the United States as a world leader in technology. He hopes that with Obama's intervention, American citizens' trust in the company's products will remain strong and the Internet will not be impaired by the controversy.
Although there is no question that the company is one of the biggest, others in the industry that serve huge user bases are worried too. Large Internet-based companies such as Dropbox, Facebook and the giant Google have also expressed concern over the leaked information. In total, eight technology vendors have responded negatively to the leaks.
Specifically, Dropbox, Apple, AOL, Facebook, LinkedIn, Twitter, Microsoft and Yahoo drafted a letter to Obama, citing the "harmful" effects of the surveillance. The letter urged the president and Congress to establish laws making government surveillance proportionate to the risks, transparent and subject to independent oversight.
In contrast, some may say that the Foreign Intelligence Surveillance Act of 1978 (FISA) gives the government sufficient power to tap into electronics without violating the law.
Under FISA, the US government is free to use both electronic and physical surveillance to gather information on any person or group engaged in terrorism or espionage on US soil.
What Happens Next?
The American public is divided over the news of NSA’s tapping of Cisco products. Although some have no problem with this controversy if it is indeed true, others are crying foul over the possible breach in their privacy. For some however, the question is: what if Cisco isn’t the only one? What if other networking companies are also being utilized for the same purpose?
Right now, the instigator of the controversy – Edward Snowden – is in exile in Moscow, and it seems the media has moved on to more "current" matters. But Ibrahim Baggili sheds some light on why the NSA has gone to such lengths to "spy" on its targets.
The director of cyber forensics research and education at the University of New Haven theorizes that the NSA's main goal is to collect data from targeted individuals and track traffic between groups and persons. The ultimate goal: to protect US soil against foreign threats – something the government has been very keen on since the 9/11 tragedy.
John Kindervag, Vice President and Principal Analyst at Forrester Research, notes that the intensified surveillance is "inevitable." It is only natural, he says, for the NSA to push the limits and see at which point it is reprimanded for its behavior.
He concludes that the Internet is "very young in the scope of world history" and that the balance between security and privacy in the digital age has not yet been achieved. Whether that equilibrium is found after the NSA controversy remains to be seen.
Photo source: NorthSydneyIT (northsydneyit.com.au)
Project management thoughts
In order to remain competitive, many small companies have had to upgrade their payroll systems to a localized self-service model that is much less expensive than the soon-to-be-antiquated centralized payroll systems many enterprise-level companies still use.
The innovation has yet to fully hit the business mainstream; however, it is more than accepted as legitimate by the companies that have the leverage to change on a dime.
In order to implement such a system without causing an operations bottleneck that affects employees who are expecting a paycheck, a variety of project management skills must be applied. Although the goal is to decentralize payroll, the project itself must usually be centralized around a project manager with a certain skill set.
Among the companies on record that have successfully made the switch, many technologies were already in place before any big moves were made. Here are just a few of the ways a company can use its tech and human project management resources in tandem to decentralize its payroll system.
Finding a Good BPO Provider
The secret to success in a widespread endeavor such as a payroll overhaul lies in properly outsourcing certain aspects of the procedure. A good business process outsourcing (BPO) partner is essential to minimizing the internal human resources consumed by the change.
When AstraZeneca chose to change its entire global payroll system, it picked Northgate Arinso because of that company's ability to navigate the cultural, political and technological challenges of the many countries in which AstraZeneca would be moving its HR functions.
This is not a direct endorsement of Northgate's services; you may not need an internationally connected company to accomplish your payroll decentralization. However, the reasoning behind the partnership is worth noting for any situation.
AstraZeneca chose Northgate because of the potential for collaboration. The Northgate IT team leveraged AstraZeneca's data migration resources to reconnect branches of AstraZeneca that had not communicated with each other for years because automation had been papering over the time lag.
The collaboration between the two companies re-energized the personal efforts of the entire AstraZeneca team without overworking employees at any branch. Daily operations continued without a hitch while a minimal internal staff, backed by Northgate specialists, worked on the payroll changes.
One of the most important aspects of changing payroll at this level was that the effort was driven by both the human resources and finance departments. One might think this would cause an overload of opinions and potential conflicts of interest, given the sometimes opposing nature of these two branches of the business.
Because of the personal “glue” that the Northgate specialists provided, however, the AstraZeneca team was not overwhelmed or pressured at any time. The delegation between the two departments became an asset rather than a power grab.
This had to do with the project management skills of a single individual with a penchant for delegation – Ana Calado. Ana spearheaded the effort from within AstraZeneca by attaching herself to the Northgate team through specially appointed delegates.
She made sure that all of them were on the same page politically and operationally before deploying them, with orders to lead the Northgate specialists in a consolidated data migration effort that would roll a centralized system out to branches across the world with the consistency of a McDonald's (the consistency of its operations, not of its burgers).
One of the technologies Ana relied on frequently was an automated payroll software solution that managed the tiered access structure of the AstraZeneca payroll logs. She stated that the process would have gone even more smoothly with access to newer technology (such as the Xero-integrated Deputy time tracker, which I have actually used on several occasions) that far outpaces the software she was using. The more up-to-date the access system, the less time is lost verifying the role everyone plays in the process.
This is especially important when two companies are involved, one of which needs access to certain files and logs without being able to see other records. More time was spent keeping Northgate employees out of certain areas of the AstraZeneca pay logs than was spent giving them access to the proper channels.
Even with this setback, Ana put the wheels of collaborative project management in motion in a way not often seen within a single company, much less between two. She states in interviews that she was able to overcome the technological shortcomings because of the unity of purpose she gave all teams before sending them out to accomplish their mission in the best way they saw fit.
She gave them enough room to solve their own problems while the final goal was set by a centralized source, which is one of the finest examples of the use of human resources in the modern business world. Think of how easy it would be for you with the proper project management software taking her example as a lead. Get the right technology and give the right message to your team – success will follow soon after.
image credit: Printerzone
Depending on the kind of printer you have purchased, you might get one that is Linux-ready out of the box and will thus just fit in with your operating system.
On the other hand, you might get a printer that is not supported out of the box. In most instances, all you need to do is install a driver for such a printer and voila! You are ready to print.
Because there are very many versions of Linux out there, covering every printer configuration system would be problematic.
To overcome this, there is a setup tool aptly called CUPS (Common UNIX Printing System) that offers a web-based, universal interface on all distributions that use CUPS for printing.
What exactly is CUPS?
CUPS is basically a modular printing system that acts as a print server for UNIX-like operating systems, on both networked machines and stand-alone computers. CUPS consists of the following three key systems:
A print scheduler/spooler, which queues print jobs for the printer;
A filter system, which converts the data into a format the printer understands;
A back-end system, which transports the data from the filters to the printer.
When CUPS is installed in the system it installs the following directories by default:
/var/spool/cups-pdf; this is the spooler directory where all the PDF files generated by CUPS are held for printing.
/var/spool/cups; this is another spooler directory where general print jobs are held before being printed.
/etc/cups; this is the configuration directory
In addition to the above, CUPS also installs its service in one of two locations: /etc/rc.d/init.d or /etc/init.d/.
Depending on the location or distribution used, you will type the following command to start the service (Debian example):
/etc/init.d/cups start
To stop the service, you type the following:
/etc/init.d/cups stop
To restart the service, you type the following:
/etc/init.d/cups restart
Remember to adjust the path according to where the service script is installed on your machine.
How to Configure Your Printer
image credit: ESP
This configuration is done using the integrated web-based CUPS tool, and the walk-through covers setting up a remote printer. This is because the process for a remote printer is slightly more complicated and thus offers a good opportunity to learn the installation and setup procedure. For the less adventurous: you can always hire a print management professional or company to do it for you if you're in a corporate environment.
The main purpose of the setup process is to let CUPS create what is known as a PostScript Printer Description (PPD) file.
This file contains all the features of the printer in question, along with the PostScript code used to invoke those features for that printer's print jobs.
To configure the printer using the above-mentioned web-based CUPS tool, open your web browser and go to the main page of the CUPS tool at http://localhost:631
From here one should follow these steps to set up the printer:
Step 1 – Click The “Add Printer” Button
This button is on the main page. Next to it is a "Manage Printers" button, which comes in handy if you have more than one printer; it allows you to manage all the printers that have already been installed.
Step 2 – Key In The Name, Location And Description Of The Printer You Are Setting Up
There are certain conditions you must fulfill on this page. When you type in the name of the printer, make sure it does not include a space, "#" (hash) or "/" (forward slash), so the name appears as one continuous word.
The location should just state where the printer is located such as Lab 2 or Lab 4. You can use any human readable characters.
The description should be a human-readable description of your printer, such as HP LaserJet 6781, and can include spaces.
Once you have filled in the three input boxes you should click “continue“.
Step 3 – Select The Device From The List
At this step, you are expected to select the URI of your device, which in most instances is either remote or local. If you are installing a local printer, it will be listed in the drop-down box, so all you need to do is select it.
If the printer is remote, select the Internet Printing Protocol as the URI of the device.
Click the “continue“ button to go to the next step.
Step 4 – Enter The URI That Tells The Back-ends Exactly Where The Printer Is Located
Because, as stated earlier, we are configuring a remote printer, you must enter the address of the printer. The address can take various formats, which are displayed in the window.
So if, for the purpose of this walkthrough, the printer queue lives under /printers/ on the server and is aptly called LaserJet, we will type in something like this: ipp://192.168.0.100/printers/LaserJet. It follows the format ipp://hostname/printers/queuename.
If you are connecting to a print server, you will need to know this information beforehand. Make sure you include the ipp:// section; failing to do so will make it impossible to connect your machine to the remote printer.
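CUPS also ships a command-line tool, lpadmin, that can register the same remote queue without the web form. A sketch follows; the queue name, address, location and description are placeholders for your own printer, and setting CUPS_DRY_RUN=1 prints the lpadmin invocation for review instead of executing it:

```shell
# Register a remote IPP printer from the shell -- a sketch; all four
# arguments are placeholders. With CUPS_DRY_RUN=1 set, the command is
# echoed instead of run, so you can inspect it before committing.
add_printer() {
    queue="$1"; uri="$2"; location="$3"; description="$4"
    ${CUPS_DRY_RUN:+echo} lpadmin -p "$queue" -E \
        -v "$uri" -L "$location" -D "$description"
}

# Example:
# add_printer LaserJet ipp://192.168.0.100/printers/LaserJet "Lab 2" "HP LaserJet 6781"
```

Note that -E placed after -p enables the queue and tells it to accept jobs, which saves a second cupsenable/cupsaccept step.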
Step 5 and 6 – Select The Printer’s Manufacturer And Model
When selecting your model, make sure you get the correct one, as there may be different models for different languages. If you don't find your model, you will have to install a driver for it.
You can use Google or any other search engine to find a compatible or proprietary driver. To install the driver you have found, go back to Synaptic, search for the name of the driver package and install it.
Once you are through, click the "Add Printer" button to add your configured printer. In most instances you will be required to key in your username and password before the installation is completed.
Step 7 – Configure Any General Settings For The Printer
After your authentication succeeds, a new page appears. This page lets you configure any additional or specific settings for your printer, such as what to do when the printer jams, the error and operation policies you want the printer to apply, and the power-save period, among other things.
Once you are through with all the above, you are ready to print from a remote printer.
OK, here's the funny fact: monitors suck. They're too small, and the bigger, quality ones are way too expensive. It's essential for a modern, productive developer to work across multiple screens to avoid the frustration of constantly moving windows of text editors, web browsers and documentation around (ALT+TAB hell), to the point where the time spent doing that could actually be spent smarter – working.
I decided to move all my web development to a different kind of local: the Raspberry Pi. It's small, cheap, versatile, has an active development community and, most importantly, it works. It's also very helpful for presentations in conference rooms with a large TV (my current company setup).
I chucked Debian onto it, assigned it its own IP, plugged in a fast external drive (auto-backed-up to my main PC nightly) and installed every tool I need or might need in the future, including a web/database server and Git. A fully custom, neat workstation. Works magic for me.
First, an introduction:
A Brief Device Profile
The nifty credit-card-sized $25-$35 microcomputer takes us back to the 1980s and the time of 8-bit computing. It's a cheap, Linux-based device that doesn't come with a monitor. Plug it into your TV, fit in a keyboard and a mouse, connect a power source, add an operating system and storage, and you have a computer.
The computer started out as a way to get kids interested in computer science. It doesn't have a hard disk or SSD; it boots from an SD card, which also offers some storage space. Since it hit the market in 2012, the computer has also become popular with programmers looking for a handy, cheap device to test their projects on. 500,000 units had been sold by September 2012.
In fact, there’s even a version of Minecraft for the Raspberry Pi. Imagine the geeky thrill of a long rail craft ride (or navigating the Nether hellfires) on your TV screen and you may want to know how to connect the device to your TV.
Connecting To The Television
The option of connecting your Raspberry Pi to the television makes it very flexible to use. Don't be fooled by the size of the device: the microcomputer offers HDMI and RCA composite video outputs, and can drive VGA displays through an adapter.
Here's a look at how you can plug the microcomputer into a television or monitor through each of these options.
1. HDMI
The great thing about the little device is that it comes with an HDMI port, and most televisions today have one too. If yours does, all you have to do is connect the device to your TV's HDMI port with a cheap cable that costs a few dollars. This means you can connect the device to the TV set in the living room.
If you have a flat-screen TV in the bedroom, that too will likely have an HDMI connector, so you can comfortably play Minecraft while lying in bed! In fact, if you own the microcomputer, the must-have pieces of equipment apart from a power supply are an SD card and an HDMI cable. With that cable, you can connect the device to just about any PC monitor and TV available today.
But what if you don’t have an HDMI port on your TV? There are other options for you.
2. HDMI To VGA Adapter
If the monitor you want to connect to doesn't have an HDMI port, check whether it has a VGA connector, the D-shaped connector that older computers used. If it does, all you have to do is get an HDMI-to-VGA adapter, which is readily and cheaply available.
If you're going the VGA route, you'll also need to make a small change to the config.txt file the Pi uses when booting. Here's how to do that. Pull the SD card out of the device and plug it into the memory card reader of your desktop PC or laptop. Open the config.txt file in a text editor and look for the following lines:
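The original snippet isn't reproduced in the post, but on a stock config.txt of that era the two commented-out lines in question were most likely these; treat the exact names as an educated guess and check against your own file:

```
#hdmi_force_hotplug=1   # output HDMI even without a handshake from the display
#hdmi_safe=1            # safe mode: forces a conservative 640x480 signal
```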
Once you've found the two lines, uncomment them both. This allows the device to push VGA-type output through an HDMI adapter and also lowers the default screen resolution to 640 x 480 to suit a VGA display.
You can set the device to output a resolution higher than 640 x 480 if you want. To do that, look for these two lines:
Again, delete the hash marks from both lines. Additionally, in the first line change '1' to '2' and in the second set '4' to '16'. When you've done that, save the file, safely remove the SD card and put it back into your microcomputer. Power on and enjoy nostalgic VGA visuals.
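The snippet itself is missing from the post; judging by the edits described, the lines in a stock config.txt are most likely the following, shown here before and after the change (an educated guess, so verify against your own file):

```
# Before (as shipped, commented out):
#hdmi_group=1
#hdmi_mode=4

# After (uncommented and edited):
hdmi_group=2    # 2 = DMT, the timing group computer monitors use
hdmi_mode=16    # 16 = 1024x768 @ 60 Hz
```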
Now, I know it's a bit of an overkill, but I simply have to say this: firing video at just one screen is not the limit :). By splitting the HDMI signal you'll be able to spread your stuff over several displays if your workflow requires it. Some of those splitters cost several times more than the whole RPi plus the time invested in setting it up, but at least you have the choice.
3. RCA Output
The last option the device gives you for connecting to a display is the RCA connector. You'll find it right next to the audio port, on the side opposite the HDMI port. The RCA port is a standard port found on most TV sets made since the '80s. Note, however, that the microcomputer gives preference to HDMI: if an HDMI cable is also connected, it will automatically switch from RCA to HDMI output.
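If you do go the composite route, config.txt also lets you pick the TV standard and aspect ratio. These option names come from the standard Raspberry Pi config.txt settings:

```
sdtv_mode=0     # 0 = NTSC, 2 = PAL
sdtv_aspect=1   # 1 = 4:3, 3 = 16:9
```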
You can change the window display style of your new microcomputer as well, depending on the screen resolution of the monitor you've connected. In fact, if the monitor is not high resolution, you may need to do this. All you have to do in that case is open the config.txt file as explained above, change the overscan settings and configure the output to match your monitor.
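For reference, these are the standard overscan options in config.txt; the pixel values below are just examples to tweak until the picture fits:

```
disable_overscan=1   # use this if the desktop doesn't fill the whole screen
#overscan_left=16    # ...or trim individual edges (in pixels) if it spills over
#overscan_right=16
#overscan_top=16
#overscan_bottom=16
```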
The Raspberry Pi is clearly a flexible device that users have found many other cool uses for. Want a digital picture frame but find it too expensive? You can convert the device into a picture frame at half the price and have it display weather reports and movies as well! Or overclock it and create a synced MIDI and Christmas lights show. You can check out cool projects to create with the device here.
But if you're simply looking for a way to connect it to a screen, you may already have an RCA cable lying around somewhere, or an HDMI cable that could have you hooked up to a microcomputer media center in minutes. Also check out this great little (but detailed) unofficial tutorial to learn the basics of what you can do with this surprisingly resourceful little device.
Whenever I'm preparing for a vacation or holiday travel, there are two essential devices that I cannot afford to leave behind: my tablet and my phone. Yes, I'm hooked, and I rely on them heavily. Just like you.
In fact, what worries me most at such times is whether my phone or tablet has ample space for all the photos I'll take; my laptop is not part of the plan and I always leave it behind, even though I spend the majority of my working hours on it.
Since the way we use our devices today has profoundly changed both the industry and us personally, I decided to dedicate some time to writing about the impact of mobile technologies on the world we know.
1. By 2013, mobile phones had overtaken PCs to become the most common Internet access devices across the globe
The digital world has come a long way from the era of green screens, which were finally replaced by PCs that also had green screens. It took almost a decade before color-screen PCs were found in the homes of average users.
After this breakthrough, the web browser became an essential element of many of our day-to-day work activities, though that too came to pass only after several years of endless invention and innovation.
Fortunately, we are in the prime of mobile transformation where any average person can find all manner of applications on their smartphone. For example, now you can access multiple email accounts on your mobile phone.
2. In 2008, mobile media made history in the communications industry as the first sector to hit the $1 billion revenue mark after only 5 years, compared to the Internet, which took 16
As I use my mobile phone to listen to music, read e-books, watch videos, play games and utilize other productivity tools (even manage work servers, YAY!), some people are making good money from me. Since I always have this gadget with me, temptations for impulse buying are always irresistible.
According to a friend from UniqueMobiles, the mobile media sector overtook the Internet even before smartphones had fully penetrated the market, particularly in developing countries where people still use feature phones. The next five years will bring massive transformations in the mobile media industry.
3. Today, more than 80% of the population owns a mobile phone
The transformative effect of mobile phones across the globe is just amazing with IBM and Airtel organizing mobile development initiatives for Ghanaian students and improving the economies of developing countries. With 80% of the world’s population owning a mobile phone, developers can rest easy knowing that their work will reach as many people as possible.
4. Americans spend about 2.7 hours daily socializing on their mobile gadgets, over twice the amount of time they spend eating
Many people spend this time sending photos, tweets and instant messages and sharing what they are doing. This has led to the popularity of various niche apps. And you can talk to anyone, whether or not they are on the same device or network.
5. By 2014, mobile Internet usage will overtake desktop Internet usage
To fellow devs: You’ll need to prepare yourself for this transition by updating all your apps to run on mobile devices and getting native apps for each platform.
6. In 2012, there were over 1.08 billion smartphones out of the 4 billion mobile phones globally
This figure shows that about a quarter of phone users worldwide have smartphones, and they will likely want native-like capabilities and applications. So developers should work on creating an appropriate interface for each mobile operating system, such as iOS, Android and Windows Phone.
7. It took smartphones 7 years to hit the 40 million user mark, whereas tablets reached 40 million after only 2 years
Although it took several years for smartphones to reach their current platform maturity, the tablet market capitalized on those lessons to grow even faster. Therefore, every developer should be well versed in the tablet platforms, not just the usual phone ones such as iOS, Android and Windows.
8. By the year 2011, there were over 400 different types of smartphone devices on the US market, providing the consumer with a wide variety of options to choose from
With over 400 types of smartphone devices, writing custom apps for all of them may prove untenable for a single developer. Even if you decide to focus on 80% of the market, the number of devices you need to support is still overwhelming. I hear you, responsive webdevs!
The hardest part is writing and testing code for all the potential iterations, which requires time you may not have. The best and most effective way to focus your limited resources and time is to analyze market penetration and concentrate on the leading segment.
9. In 2011, smartphone usage almost tripled
Although usage does not necessarily mean users, we should look at what we use our mobile gadgets for beyond texting and email. For example, I listen to at least 2 hours of podcasts daily, some of which are videos. That drives so much traffic it strains my data plan. But my tablet's Internet connection is faster than my home ISP, so I can use it to watch my podcasts.
This is what most employees who want to get their work done regardless of location are yearning for. This great shift is going to transform how we manage our work and businesses. With more and more mobile devices finding their way into the workforce, IBM has decided to address the issue by rolling out a BYOD (bring your own device) program. Amazing!
10. In 2011, mobile traffic was eight times the size of the entire worldwide Internet traffic in 2000
This implies that the mobile world is growing fast and transforming every aspect of IT. It is therefore important for every developer to keep up to date with each new development, and that is the major reason why Impact 2013 was a must-go event for every developer.
Most developers were delighted to attend the conference, as it gave them an opportunity to meet fellow developers, interact with the product managers and learn from customers how they are dealing with these challenges.
KVM guest performance can be improved by knowing and selecting the guest caching mode that best fits your environment. Caching helps the operating system maintain a page cache so that storage I/O performance improves.
Write operations to the storage system are considered complete once the data has been copied to the page cache. Read operations can be satisfied from the page cache whenever the requested data is already present in it.
fsync(2) is used to flush the page cache to permanent storage, whereas direct I/O bypasses the page cache altogether. In a Kernel-based Virtual Machine environment, page caches can be maintained by both the host and the guest operating systems, which means two copies of the data may exist in system memory.
Normally, one of these page caches is bypassed to improve KVM guest performance. For instance, if the application running in the guest uses direct I/O, it is better to bypass the guest page cache.
If the none caching mode is set for the guest, all I/O operations from the guest become direct I/O operations on the host, bypassing the host page cache.
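For example, when starting a QEMU/KVM guest by hand, the host page cache is bypassed by selecting cache=none on the drive (the image path below is hypothetical):

```
qemu-system-x86_64 -m 2048 \
  -drive file=/var/lib/libvirt/images/guest.qcow2,if=virtio,cache=none
```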
Write performance can also be improved by the disk write cache: a write operation is considered complete as soon as it reaches the disk write cache, even though the data has not yet been physically transferred to the disk media.
But the disk write cache carries a risk of data loss if the cache has no battery backup and a power failure occurs. Applications must issue fsync(2) to ensure that written data is actually transferred to the physical disk media.
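From an application's point of view, the fsync(2) pattern looks like this; a minimal Python sketch, where the file name is arbitrary:

```python
import os
import tempfile

# Hypothetical journal file used only for illustration.
path = os.path.join(tempfile.mkdtemp(), "journal.log")

with open(path, "ab") as f:
    f.write(b"record-1\n")   # data sits in a user-space buffer
    f.flush()                # moves it into the kernel page cache
    os.fsync(f.fileno())     # asks the kernel to commit it to permanent storage

# Only after fsync returns can the record be considered safe against power loss.
with open(path, "rb") as f:
    print(f.read())  # b'record-1\n'
```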
Enabling the disk write cache normally improves write performance significantly, but in case of a power failure the protection and integrity of the data can only be ensured if the storage stack and the applications correctly flush the cached data to permanent storage.
On the other hand, if the disk write cache is disabled, write performance may suffer, but the risk of data loss on power failure is considerably lessened, which is a good point.
Some resources regarding KVM guest performance:
Usage of Virtio device drivers
I’ve been able to improve my agency’s VPS (FreeBSD guest) performance before we turned to MacquarieTelecom’s secure hosting for government agencies.
The caching modes used by Red Hat Enterprise Linux 6 for improving KVM guest performance are described below.
Writethrough (The Default)
This caching mode enables the host page cache for the guest but disables the disk write cache. As a result, it keeps data integrity safe even if the applications and storage stack do not properly flush data all the way to permanent storage using file system barriers or fsync operations.
Read performance is generally better for applications running in the guest because the host page cache is enabled, but the disabled disk write cache hurts KVM guest performance for write operations.
Writeback
This caching mode enables both the disk write cache and the host page cache. It improves I/O performance for applications running in the guest, but the data is not protected against power failure, so there is a risk of data loss. This caching mode is therefore recommended only where the data being written does not need to survive to permanent storage.
None
This caching mode enables the disk write cache for the guest but disables the host page cache. It pushes KVM guest write performance close to its maximum, because write operations bypass the host page cache and the data goes straight to the disk write cache.
Data integrity can still be ensured in this caching mode if the disk write cache is backed by a battery, or if the applications and storage stack properly flush the data using file system barriers or fsync operations.
For read-heavy workloads, however, this mode may not reach the performance of the modes with the host page cache enabled, precisely because the host page cache is bypassed.
Unsafe
Cache flush operations are completely ignored in the unsafe caching mode. As the name suggests, it is recommended only for temporary data, where losing it cannot hurt the quality of the operation. It can be handy for speeding up guest installation, but to improve KVM guest performance in production you should opt for one of the other caching modes.
To conclude, for local or directly attached storage I recommend using one of the caching modes that enable the host page cache, such as the writethrough mode, to improve KVM guest performance. This mode ensures data integrity while offering acceptable I/O performance for applications running in the guest, especially for read operations.
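With libvirt-managed guests, the caching mode is set per disk through the cache attribute of the driver element in the domain XML; the image path below is hypothetical:

```xml
<disk type='file' device='disk'>
  <!-- cache accepts default, none, writethrough, writeback, directsync or unsafe -->
  <driver name='qemu' type='qcow2' cache='writethrough'/>
  <source file='/var/lib/libvirt/images/guest.qcow2'/>
  <target dev='vda' bus='virtio'/>
</disk>
```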