Although the technology behind anti-virus software continues to improve in leaps and bounds, the threats against computers and the data they contain remain. Keeping your anti-virus software regularly updated is therefore a critical part of your online security. Keep in mind that hackers don't stick to the same tactics; they are constantly looking for ways to bypass and counter the protective programs people install on their computers. Whatever operating system you use, see to it that you have at least two anti-virus programs installed. The logic is simple: if the first line of defense doesn't catch and contain a threat, the second program should do the trick. In fact, many people use more than two security programs to protect their computers and data from malicious attacks.
In choosing the best protection for your computer, there are several factors to consider. For instance, what kind of data do you store on your computer, and how much of it is there? Anti-virus programs can only do so much in confronting threats, and viruses use different types of attacks to sabotage different types of data. The software you choose should therefore be able to protect whatever type of data you store. Fortunately, there's no shortage of software companies that focus on developing security programs, and these programs are constantly updated to ensure they can block new threats.
Here are some practical tips on how you can prevent malicious code from wreaking havoc on your computer data.
1) Choose reliable anti-virus programs. One of the best security packages on the market today is ESET NOD32 Anti-virus. This particular software is known for its comprehensive features and ease of use. Countless tests have proven that it's very efficient at stopping threats from malware such as worms, viruses, Trojans, spyware, and even rootkits. Navigating the program and its array of features is also a breeze. Beginners have nothing to worry about, because installing it and keeping it updated is just a matter of clicking a few buttons. What makes this software effective is that it scans files and data as they are opened or executed.
2) Enforce strict policies for downloading and uploading files. This is very important, especially if you oversee a computer network in which any employee can download and upload files. Keep the policies clear and make sure that every employee only downloads, uploads, or executes files that have been verified to be clean, valid, and threat-free. The general rule is that everyone should assume that every file the organization receives is not virus-free.
3) Disable auto-run programs and drives on your computer. One of the easiest ways for viruses to enter a computer is by attaching themselves to a drive and then installing themselves automatically. If this auto-run feature is disabled, it will be much more difficult for viruses to wreak havoc on your computer.
4) Block suspicious files sent to your organization via email. Hackers usually make use of email gateways as virus entry points because people often unconsciously click on links contained in messages they receive. A lot of antivirus programs have features that help in stopping these types of malicious messages. However, some of these messages can still reach anybody's inbox with a warning attached by the security program. That said, you should educate your employees or staff in effectively identifying messages that may contain viruses and worms.
5) Always back up your computer data. Anti-virus programs can't guarantee that every threat is stopped and blocked, so it's very important to keep backups of your data on external drives. Don't keep these external drives connected to your main computer network, as a virus could spread to them without you knowing it.
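As a minimal sketch of what such a backup can look like on the command line, the following mirrors a documents folder to an external drive and then detaches it; the paths here are placeholders for wherever your data lives and your drive is mounted:
# mirror the documents folder to the external drive (paths are hypothetical)
rsync -av --delete /home/username/Documents/ /mnt/backup-drive/documents/
# when finished, unmount the drive so malware on the main system can't reach it
umount /mnt/backup-drive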
Installing an antivirus program on your computer won't take much of your time, so there's no reason not to do it.
Between keeping track of employees' work schedules, monitoring expenses, and handling client complaints, running a business entails hard work. Good software can help a business stay organised and productive, and in today's digitally-powered world, business owners increasingly rely on such tools to keep their businesses functioning and growing steadily. Here are the top 5 business tools to help you become more organised.
1. Google Drive
Google Drive has made business owners' lives easy. The software enables them to access their business folders and files from virtually anywhere, and to share files or folders with other business owners (their contacts) at the click of a button. Google Drive also allows business owners to access all their business Google documents from a mobile phone, and it comes with up to 15 GB of free storage. Google Drive is especially useful for individuals who travel and for businesses with offices in different locations, because the software makes sharing files simple and easy.
2. LockedOn
LockedOn is amazing real estate software with an all-in-one management system. It will help you manage your tasks and appointments, set up goals, and accomplish them. You can also use it to handle your SMS, MMS, and email. With LockedOn you can handle communication with multiple clients in just a few clicks and send bulk SMS messages to all of them. You can also use the software to see which properties listed on your site received the most clicks; this shows you what your clients want and helps you grow as a business.
3. Expensify
It is a big headache for businesses to keep track of their managers or workers while they are on business trips. Expensify makes the whole process less painful. Businesses can link a debit or credit card to Expensify so that all charges are placed directly on an expense report. Failing that, employees can take pictures of receipts with their business phones and Expensify will extract the relevant information automatically. This makes producing expense reports easier and faster. There is also a phone app, and the cost is between $5 and $10 per active account for team and corporate users. Expensify works on Android, iPhone, Windows Phone, and BlackBerry.
4. Square
Square is a payment app. It uses a small, portable debit and credit card reader that helps businesses process transactions conveniently and fast. Square is a good fit for businesses with limited space, such as food trucks. For every swipe, the business is charged 2.75%. The charge is docked automatically from the purchase and reflected in the bank account the next day; if the business sells a burrito for ten US dollars ($10), a net gain of about $9.73 will appear in the account. Bigger businesses with annual revenue above $250,000 can contact Square for custom pricing. Square works across operating systems and devices.
5. Evernote
Evernote is software that remembers everything well. It conveniently puts all your tasks, to-do lists, and notes in one place. Remarkably, the app is not strictly meant for computers; there is also a phone app. The software enables businesses to organise and store their recordings, notes, and even pictures. Evernote suits office businesses and creative teams that have many tasks and ideas to keep organised for efficient use.
The world is digitally powered today, and this is the right time for all businesses to move to the next level. The tools above can help any business grow and run smoothly; some of them make work easier by handling expense reports, emails, and payments, among other things.
The cloud is transforming the world of business, and if your business isn’t yet on board, you’re running late for the revolution. The cloud harnesses the potential of always-on connectivity and lightning-quick responsiveness, and it has created a space in which a new industry of cloud services are thriving.
Companies the world over are leveraging the advantages that these new services have to offer, and they’re seeing returns in virtually every aspect of their operations. If you haven’t yet made a move to the cloud, here’s a look at what you’re missing.
Cloud services take advantage of the principle of economies of scale. Economies of scale are, put simply, the reduced per-unit costs that come from spreading the fixed costs of an enterprise over a larger base: as a cloud service increases its client base, the cost per client decreases. For example, a data centre that costs $1 million a year to run works out to $1,000 per client with 1,000 clients, but only $100 per client with 10,000.
Thus, instead of investing in a network infrastructure capable of handling your company's computing and storage needs, cloud services let you subscribe to a demand-based system in which you pay only for what you use. Whether it be additional features or added storage capacity, the monthly price scales up or down with demand.
Cloud services also take on the responsibilities – and the associated costs – of system and software updates and upgrades. Not only does this shift the duties away from your in-house IT department, but it also dramatically speeds up the rate at which they’re deployed.
The same principles that reduce the costs of cloud services also enhance the quality of the services they can provide. Cloud services are typically capable of offering a level of security far superior to what any one of their clients could achieve within the confines of its own staff and budget.
Major cloud-based service providers utilize hardened data centres to ensure the protection of their clients' data. These facilities employ state-of-the-art firewalls, leading-edge encryption techniques, and even armed guards.
This protection extends to the integrity of the data as well. For example, Praktika provides you with automated backups that eliminate the need to backup and store your data locally, further reducing the hardware costs and payroll hours associated with these vital, yet time-consuming tasks.
Cloud services are based on a software delivery model called Software as a Service – also known as SaaS. In this model, the software is maintained in a single location and accessed remotely using a standard web browser. As they utilize interfaces similar to any typical web application, employees typically require very little training to become proficient in their use.
This ease of use has significant advantages when it comes to their adoption and deployment – processes that once took months to complete and were often followed by intense periods of training and troubleshooting – but that’s only the beginning. Productivity is noticeably improved by systems such as these.
It’s not hard to see why – employees are able to access the service from any location with Internet access. Whether they’re at home or on the road, team members can communicate and collaborate in real time in the same virtual space. The boardroom has gone digital, and the conference table is now nothing more than a tablet.
The Cloud is Rising
The era of cloud computing has only just begun, but businesses are already clamouring to gain the competitive advantages that it offers. Indeed, there are burgeoning businesses that are basing entire business models upon the availability of cloud services.
This, of course, should be a clear warning to any company that hasn’t yet begun to consider the advantages of the cloud. Its benefits can be leveraged for you, but they can also be used against you.
Indeed, the biggest of companies are taking even bigger steps into the world of cloud computing. Entire infrastructures are being designed and deployed to serve as private clouds, complete with business-centric software solutions and in-house development teams.
The days of inflated licensing fees, bug-ridden software and long-delayed patches are over. Efficiency and agility are the name of the game in the world of cloud computing, and the cost reductions are simply too significant to overlook.
With expenditures on cloud computing expected to clock in at over $106 billion in 2016, competition between cloud services will only continue to drive down their costs and improve their capabilities. There's no better time than now to take a step into the cloud.
While many people have started ditching landlines to switch to cellular phones, many houses and most businesses still use regular landline phones. There are several good reasons to have them around -- they tend to be more reliable, often offer better voice quality, and for emergency services a landline provides your location immediately and reliably.
For most people this means having a standalone phone around, which is usually cordless. Why not? It's more convenient than a corded phone that ties you down.
However, cordless phones use radio waves to transmit the signal, usually in the same 2.4 GHz band as Wi-Fi. This "over the air" connection to the base station means that, as with Wi-Fi, someone sitting outside your home or office could listen in on what you're saying.
Wi-Fi allows you to encrypt your signal so even if someone intercepts your signal they won't be able to understand it. Can you do the same with your cordless phone?
Yes and no. Cordless phone signals can be encrypted, making it very hard for most people to snoop on your conversation. However, you can't add encryption to a phone that doesn't already have the feature -- you'll have to buy a new phone.
The reason is that older cordless phones transmit information between handset and base station using an analog signal. This means that any snooper with a radio receiver tuned to the right frequency who can get close enough to your property to receive the signal can listen in.
This is why cordless phones shifted to digital signals and "Digital Spread Spectrum" (DSS) technology, which hops between frequencies rapidly to make the signal hard to intercept. (Spread-spectrum frequency hopping was co-invented by the actress Hedy Lamarr in the middle of World War II.)
The newest and most secure phones are called DECT phones because they use an advanced form of DSS, the Digital Enhanced Cordless Telecommunications standard, adding encryption and transmitting on 1.9 GHz instead of 2.4 GHz so the phone won't interfere with Wi-Fi or other cordless phones. They're often labeled as "Wi-Fi Friendly" for this reason.
If you're wondering what kind of phone yours is, check your manual. If there's no DECT or DSS mentioned anywhere, yours is an analog phone.
Having a DSS or DECT phone will make it much harder for someone to snoop on you... but of course "harder" is far from "impossible." Hackers have been able to crack DECT encryption for some time, but it requires advanced technical knowledge, high-end radio equipment, and specialized software -- most people aren't going to go to that kind of trouble.
Normally, if someone is willing to spend that much effort to eavesdrop on you, it's because you're a highly important target -- a key person in a big company or a high-ranking government official. In that case, you'll probably be using other security measures.
If you have a DECT phone, in other words, you can be fairly confident none of your neighbors are eavesdropping. If you're still worried, you can just use a corded phone -- but keep in mind it's always possible to add a phone tap directly on your phone line.
As a young developer on the prowl or just a casual enthusiast, you've more than likely stumbled upon the term ASP.NET. Now a fabled framework used to produce dynamic web pages, ASP.NET first appeared in early 2002. In the decade and more since, it's been the go-to framework for anyone interested in developing. Everything has been said about it, from good to bad, making it a technological version of the hot girl at the far end of a bar. As someone who dated that girl for a while, I can tell you the pitfalls and highlights, the myths and truths about it.
.NET is like PHP
First things first: you cannot compare these two, and not just in the sense of "this one's way better, there's no comparison". They're just two different things. PHP is a programming language, while .NET is an application framework, meaning it's an environment for building applications. It's a platform that runs on the CLR, and you write for it in languages such as C#, VB.NET, or F#. To compare PHP and ASP.NET would be like comparing a gun to a bullet. What you can compare, however, is PHP to a language that runs under .NET, such as C#.
Future of the internet and the best technology for creating a website
This is what the fanbase will always tell you. But what a good ASP.NET developer will tell you is that the truth is very subjective. While it's the future platform for all (yes, ALL, you read that right) Microsoft technologies, its prime use is not solely the Internet. The truth is, it's far more likely to make its greatest appearances on corporate intranets. As for it being the best technology for building a website, well... it all depends on how good you are. It's amazing for creating dynamic web pages in general, but it's all about how you're used to doing things and how much money you're willing to spend. The most common question ASP.NET developers ask themselves is "How much will it cost to host this page?".
Taking it for a test ride
In the end, it's all about using what you've learned and hopefully mastered in theory, and doing something practical with it. Start simple, such as writing code that makes use of a database, or create something that helps you design tables. You should also build a simple UI that any visitor can see. It can be an article, or anything your prospective visitors would be interested in. Make sure to have a dashboard to which you have admin rights, as security plays a major role here, and in web development in general. This should all be an elemental part of your learning and growth.
Hopefully all this has given you useful insight into this magical tech beast. Now go and tame it, and eventually work effortlessly with it.
With an October 2015 deadline for US retailers and hospitality operators to migrate to accepting EMV chip cards, current estimates are that at least $8.65 billion is being spent to prepare for the shift. But is that really the main reason customers are purchasing or upgrading a POS system? Some analysts say no -- with most of the world already operating on chip-and-PIN cards, security and mobile payments are an even bigger factor. Especially for restaurants, the ability to accept the latest payment options appears to be the most important benefit of a POS upgrade.
New Functionality is Hospitality Providers' Main Desire
While usability and reliability still appear to be the main factors hospitality operators use in making POS purchasing decisions, the same isn't true for POS upgrades. According to Hospitality Technology magazine, most hospitality operators looking to upgrade their systems are focused on being able to accept new payment options like mobile payments: 56% of restaurants cited "enabling new payment options" as the main consideration in their upgrade, 9% more than the next-most popular drivers (adding mobile POS functionality and preparing for the US EMV rollout). Both suppliers and restaurant operators agree that the ability to accept the latest mobile wallet payments is having a major impact on the market. At the same time, maximising security and preparing for EMV are almost as important (with 47% of restaurants citing EMV as a reason to upgrade and PCI security compliance being an issue for 45% of restaurants surveyed). Only about a quarter of restaurants were particularly concerned with integrating their POS with other systems.
Upgrades More Important than New Hardware
67% of the restaurants surveyed by Hospitality Technology said their goal at the moment is to upgrade their existing POS solution, rather than purchase a new one. Only 19% were planning to put in a POS solution from a new vendor, suggesting supplier relationships are fairly stable. Still, with 38% looking at new POS solutions which they might install after 2015, the POS industry could be seeing a change on the horizon.
Mobile Payments, Loyalty Tools, Tablet-based Software as the Most Popular Features
Among the features most restaurants are looking out for, mobile wallet functionality tops the list -- 59% of restaurants are looking to add the feature in 2015. Close on its heels are loyalty tools and tablet-based software which employees can use as they walk around. Social media integration, on the other hand, comes in at just 33%, along with many other features such as centralised POS and inventory management. Overall, the picture suggests that restaurants are looking for flexible systems that will "work the way they do" in taking orders and processing payments. Still, security is also playing a perhaps as yet unrecognized role.
Security as an Increasingly Large Driver
According to SAIC CIO Bob Fecteau, as quoted in the Wall Street Journal, payment security may be one of the largest "hidden trends" in the POS market. In his view, 2015 may see "a whole new level of security" start to take shape. Why? Unless banks and businesses tackle the challenges in current POS technology that criminals so frequently exploit, the resulting financial impact is likely to be "significant." The crucial issue with new mobile payment services such as Apple Pay will be how to keep them secure and avoid losses.
The rising popularity of new payment technologies comes at a time when criminals are ramping up their assaults on POS systems and mobile devices, according to Verisign iDefense Security Intelligence Services. The number-one prediction in their 2015 "Cyber Trend and Threat Analysis" report is increasing attacks on mobile and POS technology, and their researchers have observed attackers developing new software to attack mobile platforms and POS devices. Despite law enforcement agencies' best efforts, the US alone is estimated to lose around $8.6 billion to credit card fraud each year. At the same time, the EMV shift means that merchants who use POS systems which are not EMV compliant but who take EMV cards accept liability for any fraudulent transactions.
For many merchants this is not much of an issue, of course: industry estimates are that 70% of the POS terminals outside the US are EMV compliant, while 40% of the cards in worldwide circulation support EMV. The highest adoption rate is in Europe (with 96% of card-present transactions using EMV), followed by Canada, Latin America, and the Caribbean. The Asia-Pacific region (including Australia) has 71% of terminals supporting EMV, but just 17% of cards.
Currently, there are over 3.9 million jobs in America associated with cloud computing, and of these, 384,478 are in information technology alone! IT professionals armed with cloud computing experience take home a median salary of $90,950. Internationally, there are a staggering 18,239,258 cloud computing jobs, with China accounting for the largest share at 40.8%.
These and other important insights come from WANTED Analytics, a firm that specializes in data analytics on particular workplaces and industries. Its database contains over 1 billion job listings and documents workplace hiring trends from over 150 countries.
Most Wanted Computing Certifications:
To land these top jobs, it's important to know exactly what qualifications are required. The data shows that it pays to invest in certifications such as Project Management Professional (PMP), Top Secret Sensitive Compartmented Information clearance, Cisco Certified Network Associate (CCNA), and Certified Information Systems Security Professional (CISSP). If you want to advance your qualifications and get your dream IT job in 2015, make sure to invest in one of the above to give yourself a leading edge.
Number of IT Jobs:
It is important to have a clear understanding of the employment options within the industry. The analytics gathered show that the industry currently has 1,533,742 job openings globally! As previously noted, China leads the employment force with 40.8% of the jobs. The US is the second-highest employer in the field, with 21.7% of the jobs located across the country. India comes in third, accounting for 12.2% of computing and IT jobs.
Organizations Occupying the Workforce:
The top three worldwide organisations leading the IT workforce are IBM, Oracle, and Amazon. Other companies renowned in the IT world include General Dynamics, Dell, Accenture, WellPoint Inc., J.P. Morgan Chase & Co., Computer Sciences Corporation, Deloitte, Wells Fargo, and Lockheed Martin. These companies can be viewed as the trendsetters in the IT employment sector.
WANTED Analytics depicts a promising 2015, with predictions and analysis indicating a significant increase in demand for IT-related jobs. With this information at hand, qualifications are becoming increasingly important.
With the information outlined above, you are now equipped to concentrate your efforts in the right direction and establish a prosperous and fruitful career in 2015.
Installing the IBM DB2 database server in Linux is straightforward and relatively simple as it comes with a graphical installer image. The most important part of installing DB2 is preparing your system, which means installing hardware and software dependencies.
In Linux, a headless server must have a graphical environment to be able to run the DB2 setup wizard, so the X window system and a basic window manager, such as OpenBox, must be installed. After meeting the basic requirements, all you have to do is launch the installation wizard from the CD or ISO image.
Install the X Window System and a Window Manager
If you already have a Linux desktop environment, such as Gnome, Unity or KDE, you can skip this step and proceed to running the installer. If your server is only set up to run Apache from the command line, you must install Xorg and any other related packages from your package manager. Whether you use Yum, Apt or Pacman, simply enter the appropriate command to install X:
yum install xorg xorg-server xorg-utils
apt-get install xorg xorg-server xorg-utils
pacman -S xorg xorg-server xorg-utils
Refer to your distribution's official repositories for the exact package names required to run X on your system. You also need to install a video driver; it only needs to be a simple, lightweight, open-source driver if you're only going to use it to install DB2. If you have a Debian or Ubuntu server, you can install all the required packages, including Xorg and a video driver, by running the following command:
apt-get install lxde
This meta-package installs the essential packages needed to log into a graphical session and run the DB2 installer, and it should only take up 50MB to 60MB of hard-disk space.
Run the DB2 Installation Wizard
After rebooting your computer and logging into a graphical session, insert the DB2 installation disk and mount it in your user's media directory. For example, enter the following command at the Terminal prompt:
mount /dev/sr0 /run/media/username/DB2_INSTALLER
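If the mount point doesn't already exist, create it first and then run the mount command again (username here is a placeholder for your own user):
mkdir -p /run/media/username/DB2_INSTALLER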
Alternatively, just open a file manager, such as Nautilus or PCManFM, and select the disc in the navigation sidebar. In a Terminal window, enter the following commands to unpack and run the installer:
gzip -d db2setup.tar.gz
tar xvf db2setup.tar
./db2setup
The graphical installer opens, and you can install DB2 by selecting Install a Product and then choosing the products you want to install from the disc.
Recover DB2 Database Files
Once you have DB2 installed on your computer, you can remove the graphical packages if you don't want to use them. DB2 runs entirely from the command line, and you can use a few simple commands to perform maintenance operations, such as backing up and restoring database files. To restrict usage to the system administrator, connect to the database and quiesce it:
db2 connect to database-name
db2 quiesce database immediate force connections
(When you're done, db2 unquiesce database restores normal access.)
Substitute the name of your database for database-name in these commands. To back up a database, use the following simple command:
db2 backup db database-name
The database is saved in your current directory. Later, you can restore the database with the following command:
db2 restore db database-name
This command automatically restores the most recent backup image. If you would rather restore an earlier backup, include a time stamp with the restore command, as in the following example:
db2 restore db database-name taken at timestamp
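For instance, assuming a hypothetical database named sales and a backup taken at noon on 16 March 2015 (timestamps use the yyyymmddhhmmss format), the command would look like this:
db2 restore db sales taken at 20150316120000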
Next, roll the database forward through its log files. The following command rolls forward to a specific point in time (replace isotime with an ISO-format timestamp); to apply all available logs instead, use to end of logs in its place:
db2 rollforward db database-name to isotime using local time and stop
After issuing these commands, your DB2 server is ready to be used. It contains the restored database contents with the most up-to-date log file.
At first glance, an ethernet cord or optic fibre cable reveals almost nothing about itself. The mixed colors of its outer sheathing do not represent the complexities of this technology.
The colors come in varied shades such as blue, red, yellow, and mauve. But these shades are mere coatings; what is crucially important is what lies within them -- and the distinction between patch cables and cross-over cables.
Another noted variation is whether the cable is Cat5, Cat5e or Cat6. Aside from this, the cable's certification rating is equally important: the rating determines whether the cable is safe to run in an air duct, outdoors, or under a carpet. To get to know this technology better, let us dig deeper into its components.
Some useful guides:
Patch Cables over Cross-over Cords
Ethernet cords' technology is complex, especially on the inside, and the arrangement of the wires is essential. While some cables have wires that run in parallel from one end to the other, others are manufactured as cross-over cords.
Patch cables, also known as straight or parallel cables, are fashioned to connect computers to network devices such as routers and switches.
However, if you need to link two computers together directly, a patch cable cannot do this unless one of your computers has a network adapter with a built-in cross-over support feature.
Cross-over cables, on the other hand, reverse the order of some of the wires at one end. If you use a patch cord to link two computers, both machines end up trying to transmit on the same wire pairs. In practice, a cable wired to the T568B standard on both ends is a patch cable, while one wired T568A on one end and T568B on the other is a cross-over.
The Variation of Cables: Cat5, Cat5e and Cat6
Cat5 can only transmit data at a rate of 100 Mbps, so it is not as fast as the others. However, if you look at the ethernet cables you bought a couple of years ago, they would probably be categorized as Cat5 or Cat5e.
Cat5e can deliver gigabit ethernet, at an approximate rate of 1 gigabit per second. It is important to note, however, that data rates depend on bandwidth ratings.
Moreover, Cat6 is the next generation of the previously mentioned cords. Cat6 is rated at 250 MHz of bandwidth, well over twice the 100 MHz of Cat5e, and its design ensures less noise at its connection points.
Because of its higher specifications, Cat6 is a bit more expensive than the others. The good news is that it can comfortably stream large amounts of data, such as HD video.
When it comes to optic fibre cables and cords, the network speed of an Internet connection often depends on the capabilities of the router, switch, and computers; these usually determine the speed more than the cable does.
Overall Quality of Cables and Prices
Most manufacturers use UL-certified cables. This has been the practice since 1994, as UL has kept certifying communication cables to assure their safety and quality.
So when you are buying optic fibre or ethernet cables, it is highly practical and wise to look for those that are certified and packaged with the UL certification.
Under UL's provisions, there are six safety designations that categorize cables for varied uses; the higher the designation, the higher the price. For example, cables with the CM marking can be used inside buildings without undue risk of fire.
CMP-marked cables, on the other hand, are best for dropped ceilings and air ducts. There are also CMUC and CMX, which are intended for under-carpet use and outdoor setups, respectively.
Know what you’re gettin'.
Cisco Systems (CS) is one of the biggest and most successful manufacturers of networking equipment. It has not reached such a high position in the electronics industry without suffering several controversies.
This company, with total assets of more than $100 billion, might be facing the biggest controversy of its existence, however, as news of "tapping" becomes widespread.
No Place to Hide
The author of the book "No Place to Hide," Glenn Greenwald, reveals the relationship between CS and Big Brother. Conspiracy theorists have long suspected that the US government is constantly watching them, and through Greenwald's book, this might well be true.
Greenwald's source is none other than Edward Snowden, a former contractor for the National Security Agency (NSA), revealing that the NSA has been tampering with Cisco's products in a bid to keep track of specific people, or "targets."
In a newsletter released in 2010, it was revealed that products are pulled off their original shipping route, brought to a secure location, and implanted with beacons. The same products are then placed back on their normal route and delivered to the intended targets, who are none the wiser about the tampering.
Snowden further identifies Tailored Access Operations (TAO) employees as the ones directly responsible for the placement of the beacons.
The allegations were released together with a photograph showing a team from the NSA installing the beacons in electronic devices bearing the Cisco logo. According to the book, the photo came with the newsletter sent to all NSA employees, citing the work as a "routine process" by the NSA's Access and Target Development Department.
Of course, not all CS products were tampered with; the NSA chooses the specific people it wants to "monitor" and proceeds to bug their electronics for inside information.
Without Our Knowledge or Permission
The allegations caused a nationwide clamor, with the NSA placed under the public's magnifying glass. For CS, however, the problem may be bigger, as it threatens the future of this billion-dollar company.
To control the damage, a top executive from the company immediately published a response on its official website. Mark Chandler, Senior Vice President, General Counsel and Secretary, categorically denied any involvement with the United States in these "beacon implants." He went further, saying that the company does not work this way with any government -- including the United States.
Another top executive of the company revealed that if any tampering was done, it was done without the company's knowledge or permission. Nigel Glennie, Senior Manager of Corporate Communications, further implied that the information given in the book was vague.
According to him, although the company's logo is clearly visible in the photo, there are no specifics as to which products were tampered with, the techniques used by the NSA, or the weaknesses of the said products.
A Letter to Obama
What is interesting about this story is that CS sent a letter to Obama asking for help with the situation. For some people, the mere fact that the company appealed to the President of the United States confirms that it knew about the tapping done by the NSA.
Sent by John Chambers, the company's CEO, the letter underlines the importance of trust between the company and its customers.
"Our customers trust us to be able to deliver…products that meet the highest standards of integrity and security."
John Chambers points out that the controversy could ruin the position of the United States as a world leader in technology. He hoped that with Obama's intervention, American citizens' trust in the company's products would remain strong and the Internet would never be impaired by the controversy.
Although there is no question that the company is one of the biggest, there are others in the industry that cater to large numbers of users. Large Internet-based companies such as Dropbox, Facebook and the giant Google have also expressed worry over the leaked information. In total, eight technology vendors have responded negatively to the leaks.
Specifically, Dropbox, Apple, AOL, Facebook, LinkedIn, Twitter, Microsoft and Yahoo also drafted a letter to Obama, citing the "harmful" effects of the information control. The president, along with Congress, was prompted to establish laws making government surveillance "proportionate, transparent and subject to independent oversight."
In contrast, some may say that the Foreign Intelligence Surveillance Act of 1978, or FISA, gives the government sufficient power to tap into electronics without violating the law.
Under FISA, the US government is free to use both electronic and physical surveillance to gather information on any person or group engaged in terrorism or espionage on US soil.
What Happens Next?
The American public is divided over the news of NSA’s tapping of Cisco products. Although some have no problem with this controversy if it is indeed true, others are crying foul over the possible breach in their privacy. For some however, the question is: what if Cisco isn’t the only one? What if other networking companies are also being utilized for the same purpose?
Right now, the instigator of the controversy -- Edward Snowden -- is in exile in Moscow, and the media seems to have moved on to more "current" matters. But Ibrahim Baggili sheds some light on why the NSA has gone to such lengths to "spy" on its targets.
The director of the University of New Haven's Cyber Forensics Research and Education group theorizes that the NSA's main goal is to collect data from targeted individuals and track traffic between groups and persons. The ultimate goal: to protect US soil against foreign threats -- something the government has been very keen on since the 9/11 tragedy.
John Kindervag, Vice President and Principal Analyst at Forrester Research, notes that the intensified surveillance is "inevitable." It is only natural, he says, for the NSA to push the limits and see at which point it is reprimanded for its behavior.
He concludes that the Internet is "very young in the scope of world history" and that the balance between security and privacy in the digital age has yet to be achieved. Whether the equilibrium is found after the NSA controversy remains to be seen.
Project management thoughts
In order to remain competitive, many small companies have had to upgrade their payroll systems to a localized, self-service model that is much less expensive than the soon-to-be-antiquated centralized payroll systems that many enterprise-level companies still use.
The innovation has yet to fully hit the business mainstream; however, it is already accepted as legitimate by the companies that have the leverage to change on a dime.
In order to implement such a system without causing an operations bottleneck that will affect employees who are expecting a paycheck, a variety of project management skills must be implemented. Although the process is to decentralize payroll, the process itself must usually be centralized around a project manager with a certain skill set.
Among the companies on record as having successfully made the switch, many technologies were already in place before any big moves were made. Here are just a few of the ways in which a company can use its tech and human project management resources in tandem to decentralize its payroll system.
Finding a Good BPO Provider
The secret to success in a widespread endeavor such as a payroll overhaul lies in properly outsourcing certain aspects of the procedure. A good business process outsourcing (BPO) partner is essential to minimizing the internal human resources consumed by the change.
When AstraZeneca chose to change its entire global payroll system, it chose Northgate Arinso because of the ability of the latter company to navigate the various cultural, political and technological challenges of the many countries through which AstraZeneca would be moving its HR functions.
This is not a direct endorsement for the services of Northgate; you may not need an internationally connected company to accomplish your payroll decentralization. However, the reasoning behind the partnership is worth noting for any situation.
AstraZeneca chose Northgate because of the potential for collaboration. The data migration resources that AstraZeneca brought to the table were leveraged by the Northgate IT team to reconnect branches of AstraZeneca that had not communicated with each other for years, their automation resources having masked the lag.
The collaboration between the two companies re-energized the personal efforts of the entire AstraZeneca team without overworking the employees at any branch. Daily operations continued without a hitch while a minimal internal staff, backed by Northgate specialists, worked on the payroll changes.
One of the most important aspects of changing payroll on this level was the fact that it was driven from both the human resources and the finance department. One might think that this would cause an overload of opinion and potential conflicts of interest because of the sometimes opposing nature of these two branches of business.
Because of the personal “glue” that the Northgate specialists provided, however, the AstraZeneca team was not overwhelmed or pressured at any time. The delegation between the two departments became an asset rather than a power grab.
This had to do with the project management skills of a single individual with a penchant for delegation – Ana Calado. Ana spearheaded the effort from within AstraZeneca by attaching herself to the Northgate team en masse through specially appointed delegates.
She made sure that all of them were on the same page politically and operationally before deploying them with orders to lead the Northgate specialists in a consolidated data migration effort that would deploy a centralized system into branches across the world with the consistency of a McDonalds (not the consistency of the burger, the consistency of operations).
One of the technologies that Ana relied upon frequently was an automated payroll software solution that managed the tiered access structure of the AstraZeneca payroll logs. She stated that the process would have gone even more smoothly if she had had access to newer technology (such as the Xero-integrated Deputy time tracker, which I have actually used on several occasions) that far outpaces the software she was using. The more up to date the access system, the less time is lost verifying the role everyone plays in the process.
This is especially important when you have two companies involved, one of which needs access to certain files and logs without being able to access other records. More time was spent making sure that Northgate employees stayed out of certain areas of the AstraZeneca paylogs than was spent giving them access to the proper channels.
Even with this setback, Ana put the wheels of collaborative project management in motion in a way that is not often seen within a single company, much less between two companies. She says in interviews that she was able to overcome the technological shortcomings because of the unity of purpose she gave to all teams before sending them out to accomplish their mission in the best way they saw fit.
She gave them enough room to solve their own problems while the final goal was set by a centralized source, which is one of the finest examples of the use of human resources in the modern business world. Think of how easy it would be for you with the proper project management software taking her example as a lead. Get the right technology and give the right message to your team – success will follow soon after.
Depending on the kind of printer you have purchased, it may be Linux-ready out of the box and thus fit right in with your operating system.
On the other hand, you may get a printer that is not supported out of the box. In most instances, all you need to do is install a driver for it and voila! You are ready to print.
Because there are very many versions of Linux out there, covering every printer configuration system would be problematic.
To overcome this, there is a setup tool aptly called CUPS (the Common UNIX Printing System) that offers a web-based, universal service found on all distributions that use CUPS for printing.
What exactly is CUPS?
CUPS is basically a modular printing system which acts as a print server for UNIX-like operating systems. It can do this for both networked machines and stand-alone computers. CUPS consists of the following three key systems:
Print Scheduler/Spooler which lines up printing jobs for the printer;
A Filter System which does the data conversion for the printer to format and understand the data being printed;
A back-end system that transports the data from the filters to the printer.
When CUPS is installed in the system it installs the following directories by default:
/var/spool/cups-pdf; this is the spooler directory where all the PDF files generated by CUPS are held for printing.
/var/spool/cups; this is another spooler directory where general print jobs are held before being printed.
/etc/cups; this is the configuration directory
In addition to the above, CUPS also installs its service in one of two locations: /etc/rc.d/init.d or /etc/init.d/.
Depending on the location used by your distribution, you will type the following command to start the service (Debian example):
/etc/init.d/cups start
To stop the service, you type the following:
/etc/init.d/cups stop
To restart the service, you type the following:
/etc/init.d/cups restart
Remember to change the path according to where the service script is saved on your machine.
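To verify that the scheduler is actually up, you can use lpstat, which ships with CUPS; it reports whether the CUPS server is running:
lpstat -r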
How to Configure Your Printer
This configuration is done using CUPS's integrated web-based tool, and the walkthrough covers setting up a remote printer. This is because the process for a remote printer is slightly more complicated and thus offers a good opportunity to learn the installation and setup procedures. For the less adventurous: you can always hire a print management professional or company to do it for you if you're in a corporate environment.
The main intention of going through the setup process is to allow UNIX to create what is known as a PostScript Printer Description (PPD) file. This file contains all the features of the printer in question and the PostScript code that will be used to invoke those features for that particular printer's print jobs.
To configure the printer using the above-mentioned web-based CUPS tool, open your web browser and go to the main page of the CUPS tool at http://localhost:631
From here one should follow these steps to set up the printer:
Step 1 – Click The “Add Printer” Button
This button is on the main page. Next to it is another button named "Manage Printers"; this one comes in handy if you have more than one printer, as it allows you to manage all the printers which have already been installed.
Step 2 – Key In The Name, Location And Description Of The Printer You Are Setting Up
There are certain conditions you must fulfill on this page. When typing in the name of the printer, make sure it does not include a space, "#" (hash) or "/" (forward slash). The name should appear as one continuous word.
The location should just state where the printer is located such as Lab 2 or Lab 4. You can use any human readable characters.
The description should be a human readable description of your printer such as HP Laser jet 6781 and can include spaces.
Once you have filled in the three input boxes you should click “continue“.
Step 3 – Select The Device From The List
At this step, you are expected to select the URI of your device; in most instances it is either remote or local. If you are installing a local printer, it will be listed in the drop-down box, so all you need to do is select it.
If the printer is remote, select the Internet Printing Protocol as the URI of the device.
Click the “continue“ button to go to the next step.
Step 4 – Enter The URI Which Will Instruct All The Back-ends To The Exact Place Where The Printer Is Located
Because, as stated earlier, we are configuring a remote printer, you must enter the address of the printer. The address can take various formats, which are displayed in the window.
So if, for the purposes of this walkthrough, the printer sits in the printers spool and is aptly called LaserJet, we will type in something like this: ipp://192.168.0.100/printers/LaserJet. It follows the format ipp://hostname/printers/printer-name.
If you are connecting to a printer server, you will have to know this information beforehand. Make sure you include the ipp:// section as failure to do so will make it impossible to connect your machine and the remote printer.
Step 5 and 6 – Select The Printer’s Manufacturer And Model
When selecting your model, make sure you get the correct one as there may be different models for different languages. If you don’t find your model then you will have to install a driver for it.
You can use Google or any other search engine to find a compatible or proprietary driver that's suitable. To install the driver you have found, just go back to Synaptic, search for the name of the driver package and then install it.
Once you are through, click the "Add Printer" button to add your configured printer. In most instances you will be required to key in your username and password before the installation is completed.
Step 7 – Configure Any General Settings For The Printer
After your authentication has succeeded, a new page will appear. This page lets you set any additional or specific behaviors you may need for your printer, such as what to do when the printer jams, the error and operation policies you want the printer to apply, and the power-save period, among other things.
Once you are through with all the above, you are ready to print from a remote printer.
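As an aside, everything the web tool does here can also be done from the shell with CUPS's lpadmin utility. A minimal sketch, assuming the same hypothetical printer address as above (without a PPD this creates a basic queue, so it is a starting point rather than a full driver setup):
# register the remote queue and enable it to accept jobs
lpadmin -p LaserJet -E -v ipp://192.168.0.100/printers/LaserJet
# make it the default destination, then send a test file
lpoptions -d LaserJet
lp /etc/hosts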
OK, here's the funny fact. Monitors suck. They're too small, and the bigger quality ones are way too expensive. It's essential for a modern, productive developer to work across multiple screens to avoid the frustration of constantly moving text editor/web browser/documentation windows around (ALT+TAB hell), up to the point where the time spent doing that could actually be spent working smarter.
I decided to move all my web development to a different kind of local machine: the Raspberry Pi. It's small, cheap, versatile, has an active development community and, most importantly, it works. It's also very helpful for presentations in conference rooms with a large TV (my current company setup).
Chuck in Debian, assign it its own IP, plug in a fast external drive (auto-backed-up to the main PC nightly) and install all the tools I need or might need in the future, including a web/database server and Git. A fully custom, neat workstation. Works magic for me.
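For the curious, a rough sketch of that provisioning in shell form; the package list and paths here are my own assumptions, not a canonical recipe:
# install the basic toolchain on Debian/Raspbian: web server, database, git
apt-get install apache2 mysql-server git
# nightly backup of the projects folder to the external drive (crontab entry)
# 0 3 * * * rsync -a /home/pi/projects/ /mnt/external/backup/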
First, a brief introduction:
A Brief Device Profile
The nifty credit-card-sized $25-$35 microcomputer takes us back to the 1980s and the time of 8-bit computing. It's a cheap, Linux-based device that doesn't come with a monitor: you plug it into your TV, fit in a keyboard and a mouse, connect a power source, add an operating system and storage, and you have a computer.
The computer started out as a way to get kids interested in computer science. It doesn't have a hard disk or SSD; it boots from an SD card, which also provides some storage space. Since it hit the market in 2012, the computer has also become popular with programmers looking for a handy, cheap device to test their projects on; 500,000 units had been sold by September 2012.
In fact, there’s even a version of Minecraft for the Raspberry Pi. Imagine the geeky thrill of a long rail craft ride (or navigating the Nether hellfires) on your TV screen and you may want to know how to connect the device to your TV.
Connecting To The Television
The option of connecting your Raspberry Pi to the television makes it very flexible to use. Don't be fooled by the size of the device: the microcomputer offers three routes for visual output -- HDMI, RCA, and VGA via an adapter.
Here's a look at how you can plug the microcomputer into the television through each of these options.
1. HDMI
The great thing about the little device is that it comes with an HDMI port, and most people today own televisions with one. If yours has one, all you have to do is connect the device to your TV's HDMI port with a cheap cable that costs a few dollars. This means you can hook the device up to the TV set in the living room.
If you have a flat screen TV in the bedroom, that too will have an HDMI connector, so you can comfortably play Minecraft while lying in bed! In fact, if you own the microcomputer, the pieces of equipment that are must-haves apart from a power supply are an SD card and an HDMI cable. With the cable, you can connect the device to just about any PC monitor and TV available today.
But what if you don’t have an HDMI port on your TV? There are other options for you.
2. HDMI To VGA Adapter
If the monitor you want to connect to doesn’t have an HDMI port, check to see if it has a VGA connector. This is the D shaped connector that old computers had. If the monitor has a VGA port, then all you have to do is get an HDMI to VGA adapter that is readily and cheaply available.
You'll also need to make a small change to the config.txt file used by the Pi for booting if you're using VGA. Here's how to do that. Pull the SD card out of the device and plug it into the memory card reader slot of your desktop PC or laptop. Open the config.txt file in a text editor and look for the two commented-out lines that force a "safe" HDMI mode; in a stock Raspbian file they typically read:
#hdmi_force_hotplug=1
#hdmi_safe=1
Once you've found the two lines, uncomment them both (delete the leading # characters). This allows the device to output a VGA-type signal through an HDMI adapter. It also lowers the default screen resolution to 640 x 480 to suit VGA displays.
You can set the device to output a resolution that is higher than 640 x 480 if you want. To do that, look for these two lines (again, as they typically appear in the stock file):
#hdmi_group=1
#hdmi_mode=4
Again, delete the hash marks from both lines. Additionally, in the first line, change the 1 to 2 and, in the second, change the 4 to 16, which selects a 1024 x 768 display mode. When you've done that, save the file, safely remove the SD card, and put it back into your microcomputer. Power on and enjoy nostalgic VGA visuals.
Now, I know it's a bit of overkill, but I simply have to say this: firing video to just one screen is not a limit :). By splitting the HDMI signal, you'll be able to display your stuff over several displays if your workflow requires it. Some of those splitters probably cost a couple of times more than the whole RPi plus the time invested in setting it up, but there, you do have a choice.
3. RCA Output
The last option the device gives you for connecting to a visual display unit is the RCA connector. You’ll find it right next to the audio port, on the side opposite the HDMI port. The RCA port is a standard port found on most TV sets made since the ’80s. However, the microcomputer gives preference to HDMI, so if an HDMI cable is also connected, it will automatically switch from RCA to HDMI output.
You can change the display settings of your new microcomputer as well, depending on the screen resolution of the monitor you’ve connected to. In fact, if the monitor is not of a high resolution, you may need to do this. All you have to do in that case is go to the config.txt file as explained above, change the overscan settings in the file, and configure the output to make it compatible with your monitor, as sketched below.
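As a rough guide, these are the overscan entries you’d adjust in config.txt; the values here are illustrative only, and positive numbers shrink the picture while negative numbers expand it:

    # set to 1 to remove black borders around the picture
    #disable_overscan=1
    # number of pixels to skip at each edge
    #overscan_left=16
    #overscan_right=16
    #overscan_top=16
    #overscan_bottom=16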
The Raspberry Pi is clearly a flexible device, and users have found many other cool uses for it. Want a digital picture frame but find them too expensive? You can simply convert the device into a picture frame at half the price, and have it display weather reports and movies as well! Or overclock it and build a synced MIDI and Christmas lights show. You can check out cool projects to create with the device here.
But if you’re simply looking for a way to connect to a monitor, you may already have an RCA cable lying around somewhere, or an HDMI cable that could have you connected to a microcomputer media center in minutes. Also check out this great little (but detailed) unofficial tutorial to learn the basics of what you can do with this surprisingly resourceful little device.
Whenever I’m preparing for a vacation or holiday travel, there are two essential devices I cannot afford to leave behind: my tablet and my phone. Yes, I’m hooked, and I rely heavily on them. Just like you.
In fact, what worries me most at such times is whether my phone or tablet has enough space for all the photos I’ll take; my laptop is not part of the plan and I always leave it behind, even though I spend the majority of my working hours on it.
Since the way we use our devices has profoundly changed both the industry and us personally, I decided to dedicate some time to writing about the impact of mobile technologies on the world we know.
1. By the year 2013, mobile phones had overtaken PCs to become the most common Internet access devices across the globe
The digital world has come a long way from the era of green-screen terminals, which were eventually replaced by PCs that also had green screens. It took almost a decade before color-screen PCs were found in the homes of average users.
After this breakthrough, the web browser became an essential element of many of our day-to-day work activities, though that too arrived only after years of relentless invention and innovation.
Fortunately, we are now in the prime of the mobile transformation, where the average person can find all manner of applications on a smartphone. For example, you can now access multiple email accounts on your mobile phone.
2. In 2008, mobile media made history in the communications industry as the first sector to hit the $1 billion revenue mark after only 5 years, compared to the 16 years it took the Internet
As I use my mobile phone to listen to music, read e-books, watch videos, play games and run other productivity tools (even manage work servers, yay!), some people are making good money off me. Since I always have this gadget with me, the temptation to impulse-buy is ever-present and irresistible.
According to a friend from UniqueMobiles, the mobile media sector overtook the Internet even before smartphones had fully penetrated the market, particularly in developing countries where people still use feature phones. The next five years will bring massive transformations to the mobile media industry.
3. Today, more than 80% of the population owns a mobile phone
The transformative effect of mobile phones across the globe is simply amazing, with IBM and Airtel organizing mobile development initiatives for Ghanaian students and improving the economies of developing countries. With 80% of the world’s population owning a mobile phone, developers can rest easy knowing that their work will reach as many people as possible.
4. Americans spend about 2.7 hours daily socializing on their mobile gadgets, more than twice the time they spend eating
Much of that time goes into sending photos, tweets and instant messages and sharing what we’re doing, which has fueled the popularity of various niche apps. And you can talk to anyone, whether or not they are on the same device or network.
5. By the year 2014, phone Internet usage will overtake desktop Internet usage
To fellow devs: you’ll need to prepare for this transition by making sure your apps run well on mobile devices and by offering native apps for each platform.
6. In 2012, there were over 1.08 billion smartphones out of the 4 billion mobile phones globally
This figure shows that roughly a quarter of phone users worldwide have smartphones, and they will likely expect native-like capabilities and applications. Developers should therefore be prepared to build a suitable interface for each mobile operating system, such as iOS, Android and Windows Mobile.
7. It took smartphone users 7 years to hit the 40 million mark, while tablet users reached 40 million after only 2 years
Although it took smartphones several years to reach their current level, the tablet market capitalized on those lessons and grew even faster. Every developer should therefore be well versed in tablet platforms in addition to the usual phone platforms such as iOS, Android and Windows.
8. By the year 2011, there were over 400 different types of smartphone devices on the US market, providing the consumer with a wide variety of options to choose from
With over 400 types of smartphone devices, writing custom apps for all of them may prove untenable for a single developer. Even if you decide to focus on 80% of the market, the number of devices you need to support is still overwhelming. I hear you, responsive webdevs!
The hardest part is writing and testing code for all the potential permutations, which requires time you may not have. The most effective way to focus your limited time and resources is to analyze market penetration and concentrate on the devices your audience actually uses.
9. In 2011, smartphone usage almost tripled
Usage does not necessarily mean users, though, so we should look at what we use our mobile gadgets for beyond texting and email. For example, I listen to at least 2 hours of podcasts daily, some of which are videos. That strains my data plan, since it drives so much traffic. But my tablet’s Internet connection is faster than my home ISP’s, so I can use it to watch my podcasts.
This is exactly what employees who want to get their work done regardless of location are yearning for. This great shift is going to transform how we run our businesses. With more and more mobile devices finding their way into the workforce, IBM has decided to address the issue by rolling out a BYOD (bring your own device) program. Amazing!
10. In 2011, mobile traffic was eight times the size of the entire worldwide Internet traffic in 2000
This implies that the mobile world is growing fast and transforming every aspect of IT, so it is important for every developer to keep learning and stay current with each new development. That is also the major reason why Impact 2013 was a must-attend event for developers.
Most developers were delighted to attend the conference, as it gave them an opportunity to meet fellow developers, interact with product managers and learn from customers how they are dealing with these challenges.
KVM guest performance can be improved by knowing and selecting the guest caching mode that best suits your environment. Caching helps the operating system maintain a page cache, which improves storage I/O performance.
Write operations to the storage system are considered complete as soon as the data is copied to the page cache. Read operations can be satisfied from the page cache whenever the requested data is already present in it.
fsync(2) is used to flush the page cache to permanent storage, whereas direct I/O bypasses the page cache entirely. In a Kernel-based Virtual Machine environment, page caches can be maintained by both the host and the guest operating systems, leaving two copies of the same data in system memory.
Normally one of these page caches is bypassed to improve KVM guest performance. For instance, if the application running in the guest already uses direct I/O, it is better to bypass the guest page cache.
If the cache mode is set to none for the guest, all I/O operations from the guest become direct I/O operations on the host, bypassing the host page cache.
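If you manage your guests through libvirt, the caching mode is set per disk with the cache attribute of the driver element in the domain XML (for example via virsh edit). Here is a minimal sketch; the image path and device names are placeholders, not taken from the article:

    <disk type='file' device='disk'>
      <!-- cache can be writethrough, writeback, none or unsafe -->
      <driver name='qemu' type='qcow2' cache='none'/>
      <source file='/var/lib/libvirt/images/guest.qcow2'/>
      <target dev='vda' bus='virtio'/>
    </disk>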
The performance of write operations to the storage system can also be improved by planning around the disk write cache. With the disk write cache enabled, a write operation is considered complete as soon as the data reaches the cache, even though it has not yet been physically transferred to the disk media.
But the disk write cache carries a risk of data loss if it has no battery backup and a power failure occurs. Applications must issue fsync(2) to ensure that write data is actually transferred to the physical disk media.
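To make that concrete, here is a minimal C sketch of a write that is only treated as durable after fsync(2) returns; the file name and payload are invented for the example:

    #include <fcntl.h>
    #include <stdio.h>
    #include <string.h>
    #include <unistd.h>

    int main(void) {
        const char *payload = "critical record\n";
        /* "journal.dat" is a made-up name for this example */
        int fd = open("journal.dat", O_WRONLY | O_CREAT | O_APPEND, 0644);
        if (fd < 0) { perror("open"); return 1; }

        /* after write() the data may still sit only in the page cache */
        if (write(fd, payload, strlen(payload)) < 0) {
            perror("write"); close(fd); return 1;
        }

        /* fsync() forces the data out through the caches to stable storage */
        if (fsync(fd) < 0) {
            perror("fsync"); close(fd); return 1;
        }

        close(fd);
        return 0;
    }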
Normally, enabling the disk write cache significantly improves write performance, but in case of power failure the protection and integrity of the data can be ensured only if the applications and the storage stack correctly flush the cached data to permanent storage.
On the other hand, write performance may suffer if the disk write cache is disabled, but doing so considerably lessens the risk of data loss in a power failure, which is a worthwhile trade-off for critical data.
Some resources regarding KVM guest performance:
Usage of Virtio device drivers
I’ve been able to improve my agency’s VPS (FreeBSD guest) performance before we turned to MacquarieTelecom’s secure hosting for government agencies.
The caching modes available in Red Hat Enterprise Linux 6 for improving KVM guest performance are described below.
Writethrough
This default caching mode enables the host page cache for the guest but disables the disk write cache. As a result, it keeps data integrity safe even if the applications and storage stack do not properly flush data to permanent storage using file system barriers or fsync operations.
Read performance is generally better for applications running in the guest because the host page cache is enabled in this mode, but the disabled disk write cache hurts KVM guest performance on write operations.
Writeback
This caching mode enables both the disk write cache and the host page cache to improve KVM guest performance. It boosts I/O performance for applications running in the guest, but there is a risk of data loss because the data is not protected against power failure. This mode is therefore recommended only where the data does not strictly need to reach permanent storage.
None
This caching mode enables the disk write cache for the guest but disables the host page cache. It improves KVM guest write performance the most, because write operations bypass the host page cache and the data goes directly to the disk write cache.
Data integrity can still be ensured with this mode if the disk write cache is backed by a battery, or if the applications and storage stack flush the data properly using file system barriers or fsync operations.
For reads, however, this mode may not reach the level of the modes with the host page cache enabled, precisely because the host page cache is bypassed.
Unsafe
Cache flush operations are completely ignored in the unsafe caching mode. As its name suggests, this mode is recommended only for transfers of temporary data, where the risk of data loss cannot disturb the quality of operation. It can speed up the installation of guests, but for anything else you should opt for one of the other caching modes.
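For completeness, if you start guests with QEMU/KVM directly instead of through libvirt, the same modes are selected with the cache suboption of -drive. A sketch, with the memory size, CPU count and image path as placeholders:

    qemu-kvm -m 2048 -smp 2 \
        -drive file=/var/lib/libvirt/images/guest.qcow2,if=virtio,cache=writethrough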
To conclude, for local or directly attached storage I recommend using one of the caching modes that enable the host page cache, such as writethrough, to improve KVM guest performance.
This mode preserves data integrity while offering acceptable I/O performance for applications running in the guest, especially for read operations.