Technology rentals are a great way to save you and your business time and money, and temporary rentals are the perfect solution for short-term needs. Event professionals and business professionals are constantly attending trade shows, conventions, conferences, and seminars, or hosting corporate training events and business meetings, and all of these situations call for short-term technology rentals. But in order to know exactly what it means to rent technology, you need to hear about it from a professional.
Rentacomputer has been in the business of computer, technology, and AV rentals for over 25 years. However, even after all those years, there are still people who don't quite understand how renting technology works and how it can save you and your business time and money. Rentacomputer provides temporary business-to-business technology, such as computer rentals, office equipment rentals, and AV rentals for business events all over the United States, Western Europe, and Australia.
If that sounds interesting, and it should, you can read more about what Rentacomputer does and how it can benefit you and your business here!
Monday, December 7, 2015
These Free Chrome Extensions Are Exactly What You Need
Chrome is one of the most widely used internet browsers and users are always looking for ways to make their experience on Chrome better. One of the best things about Chrome is that it is very flexible as it allows third-party extensions to be used with ease. However, finding good extensions isn't always easy. Below you will find some Chrome hacks that will completely change the way you use Chrome. The best part about all of these extensions is that they're completely free!
Sunrise Calendar
Google Calendar is free and very powerful, but it's a little dull in the aesthetics department and is also missing some pretty important features. The Sunrise Calendar extension fixes both of these problems. In addition to that, it also comes with a mobile companion app for iOS and Android that could be better than the actual extension.
Calculator
Some Chrome extensions are complicated, but the best ones don't have to be. Calculator for Chrome is probably one of the most useful Chrome extensions out there. No longer will you have to use the calculator on your smartphone or open a separate calculator app on your desktop every single time you need to add up some numbers.
NEnhancer
Netflix is the most popular video streaming service around today. You use it all the time, your mom uses it all the time, practically every single person you know uses it all the time, but there are ways to make the service even better, especially on Chrome. NEnhancer is a free Chrome extension that adds a lot of key information to Netflix's website for TV shows and movies, including trailers, Rotten Tomatoes ratings, IMDb info, and a lot more.
WorkFlowy
WorkFlowy is the ultimate list extension for Chrome. WorkFlowy is, according to its creator, "simple enough for a shopping list, and powerful enough to run a company." Key features include infinitely nested lists; tagging and filtering of list items; full offline functionality; collaboration with others via live syncing; click-to-edit; marking items as complete; zooming in on any sub-list; automatic syncing between your phone, tablet, and computer; notes on any list item; quick expanding and collapsing of lists; and instant full-text search.
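To make the "infinitely nested lists" idea concrete, here is a minimal sketch of that kind of outline data model in Python. The class and method names are purely illustrative and are not taken from WorkFlowy's actual code or API.

```python
# Minimal sketch of an infinitely nestable, taggable list item
# in the spirit of WorkFlowy's outline model (names are illustrative).
class ListItem:
    def __init__(self, text, note=""):
        self.text = text        # the item itself
        self.note = note        # optional note attached to the item
        self.complete = False   # items can be marked as complete
        self.children = []      # sub-lists can nest without limit

    def add(self, text, note=""):
        child = ListItem(text, note)
        self.children.append(child)
        return child

    def find(self, tag):
        """Filter: yield every item, at any depth, whose text contains the tag."""
        if tag in self.text:
            yield self
        for child in self.children:
            yield from child.find(tag)


root = ListItem("Everything")
errands = root.add("Shopping list #errands")
errands.add("Milk")
launch = root.add("Product launch #work")
launch.add("Write press release").complete = True

print([item.text for item in root.find("#work")])  # ['Product launch #work']
```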
Criptext for Gmail
Criptext is a very comprehensive Gmail enhancement that adds a lot of important features to your Gmail experience. Almost everyone uses Gmail these days so extensions like Criptext are pretty handy. Criptext allows users to un-send emails before or after they've been read with the click of a button. In addition to that, they can also send self-destructing emails and attachments Mission Impossible style. Moreover, Criptext allows you to attach secure files up to 100MB in size and know exactly when emails are opened and when attachments are downloaded.
These are just a few of the Google Chrome hacks that are available for your internet browser. These extensions help make things easier and more manageable and add a lot of benefits to using Chrome. If you want to make what is arguably the best internet browser out there even better then take a look at some of these awesome extensions.
Content originally published here
Sharing this story on social media? Use these hashtags! #Google #Chrome #NEnhancer #Criptext #GoogleCalendar #SunriseCalendar #WorkFlowy #Gmail
Wednesday, December 2, 2015
HGST Debuts Ultrastar He10 10TB Helium-Filled Hard Drive
I bet you never thought you would live to see the day we had 10TB hard drives, did you? Well, helium-filled hard drives have finally reached that point with the first 10TB drive to use conventional recording methods.
Western Digital subsidiary HGST has created the Ultrastar He10 which, interestingly enough, is not the first 10TB helium drive. HGST created the first real 10TB hard drive back in the summer. However, this new drive is the only 10TB helium drive that uses perpendicular magnetic recording (PMR), the standard recording technology for hard drives over the last 10 years.
The first 10TB drive from HGST used shingled magnetic recording (SMR). SMR uses magnetic tracks that overlap to increase capacity. The downside here is that there is very little guard space between the tracks, which makes rewriting data much more difficult and slower, as adjacent tracks may need to be rewritten as well. That is why SMR is better suited for cold storage as opposed to routine recording.
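To see why that matters, here is a small, purely illustrative Python sketch of the rewrite amplification: updating one shingled track forces every overlapping track after it in the zone to be rewritten too, while a PMR track can be rewritten in place. The zone size is an assumption chosen only for illustration.

```python
# Illustrative only: how many tracks must be rewritten to update one track?

def tracks_rewritten_pmr(track, zone_size):
    # PMR keeps a guard gap between tracks, so one track rewrites in place.
    return 1

def tracks_rewritten_smr(track, zone_size):
    # SMR tracks overlap like shingles, so every later track in the zone
    # may need to be rewritten once the target track is changed.
    return zone_size - track

zone_size = 256  # hypothetical number of tracks per SMR zone
for track in (0, 128, 255):
    print(f"track {track}: PMR rewrites {tracks_rewritten_pmr(track, zone_size)}, "
          f"SMR rewrites up to {tracks_rewritten_smr(track, zone_size)}")
```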
PMR has some negatives too. The problem with PMR is that it is approaching the limits of its potential capacity. Regardless, HGST has pulled off something quite amazing by stuffing seven platters into a standard 1-inch-high, 3.5-inch drive. Helium drives from other companies, like Seagate, top out at 8TB. Both Seagate and HGST have been putting money into a newer technology known as heat-assisted magnetic recording, which allows for much higher capacity and is better suited for everyday use, though these hard drives won't start showing up until next year at the earliest.
Just like the earlier helium drives from HGST, the Ultrastar He10 is being marketed for enterprise and server use and will have a price tag of around $800. Helium is less dense than air, which means there is less drag on the moving parts of the drive. This decreased friction, combined with hermetic seals that keep out contaminants and humidity, allows these drives to run at cooler temperatures than a standard HDD and reduces energy costs. That makes them perfect for use in servers, though the direct impact on consumers will be minimal. Regardless, the larger capacity could benefit the cloud services that we are relying on more and more.
Content originally published here
Sharing this story on social media? Use these hashtags! #WesternDigital #HGST #UltrastarHe10 #HardDrives
Wednesday, November 18, 2015
The Fastest Desktop PC Ever
In the words of LT Pete "Maverick" Mitchell, "I feel the need....the need for speed!" Well, at least that's what Intel is feeling as it just announced that it will be putting its absurd 72-core Knights Landing supercomputer chip into production. However, that isn't even the most exciting part. The most exciting part is that Knights Landing, which is Intel's fastest chip to date, will be going into desktop workstations that will contain enough computational power to make Doc Brown's overloaded speakers look like a kid's karaoke machine.
PC World recently reported that the company is planning on shipping a "limited number of workstations" that will come equipped with the super-fast supercomputer chip in the first half of 2016. As a result PC makers will have the ability to adopt Intel's supercomputer silicon in desktop models on a greater scale, according to Intel's Charles Wuischpard. I don't know about you, but I'm not sure I can handle a chip like this being in something that is sitting in my room or my office. But then again the power is very alluring.
The main question that is going to be on everyone's mind is, of course, what kind of specs we can expect from the Knights Landing chip. This chip differs from the ones currently in your desktop in that this supercomputer processor puts all of its cores onto a single piece of silicon. All of those cores are then bundled up with 16GB of on-package MCDRAM memory into a PCI-E add-in card. This is very similar to the ridiculous Nvidia GPUs that are currently being installed on supercomputers around the world.
Once you have all of this packed together and installed inside your computer you are left with a piece of hardware that is capable of computing single-precision calculations at a rate of 8 teraflops, or double-precision calculations at over 3 teraflops. PC World also noted that this chip will be used by the United States Department of Energy inside of its 9,300-core Cori supercomputer and, in addition to that, Intel has also claimed that 50 different manufacturers will ship systems that use this chip in time.
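As a rough sanity check on those teraflop numbers, here is a back-of-the-envelope estimate. The clock speed and per-core throughput below are assumptions for illustration, not figures from Intel or the PC World report.

```python
# Hypothetical peak-FLOPS estimate for a 72-core chip.
# Assumes ~1.5 GHz and 32 double-precision FLOPs per core per cycle
# (two 512-bit FMA units); both numbers are illustrative assumptions.
cores = 72
clock_ghz = 1.5
dp_flops_per_cycle = 32

peak_dp_tflops = cores * clock_ghz * dp_flops_per_cycle / 1000
print(f"~{peak_dp_tflops:.1f} DP teraflops")      # ~3.5, in line with "over 3 teraflops"
print(f"~{peak_dp_tflops * 2:.1f} SP teraflops")  # single precision is roughly double
```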
In the meantime the desktop workstations, which are essentially jacked up versions of the CAD, graphics and film editing computers that are used in offices where money is as abundant as air, will be made available to researchers who are interested in using a supercomputer but are otherwise unable to gain access to one. The idea is that these individuals will be able to develop and test code on the workstation before shipping it out, error-free, to a supercomputer somewhere in the future. I highly doubt that you will be getting something like this in your iMac anytime soon, though Intel is, at the very least, attempting to put this chip into the hands of people that would otherwise have no access to such a device.
Content originally published here
Sharing this story on social media? Use these hashtags! #Intel #KnightsLanding #Supercomputer
Wednesday, November 4, 2015
Microsoft Reveals End-Of-Sales Date For Windows 7 And Windows 8.1 Devices
By now most Windows users have upgraded to Windows 10. However, some people are still using Windows 8.1 or Windows 7, while others are looking to buy a new laptop, desktop, or tablet with Windows 8.1 or 7 pre-installed. If you are one of those buyers, then you have less than a year to do so, as Microsoft has just announced when it will stop selling devices with Windows 8.1 and Windows 7 pre-installed.
According to the Microsoft Windows Lifecycle Fact Sheet, October 31st, 2016 is marked as the "end of sales for PCs with Windows 8.1 or Windows 7 pre-installed." After October 31st, 2016 the only option for customers will be to purchase new computers with Windows 10 installed. The only exception to this will be businesses with license agreements that entitle them to choose which version of Windows they wish to have pre-installed.
This deadline will be putting a lot of pressure on consumers who have become quite attached to Windows 7 and may be very apprehensive about upgrading to Windows 10 if they buy a new computer. However, this is a logical and necessary step for Microsoft in its goal of having more than 1 billion Windows 10 devices powered up. It also goes along with the company's message that Windows 10 is capable of bringing together desktops, laptops, tablets, and smartphones with apps that can run across every platform.
What Windows 7 users don't really realize is that this is actually really good for them. Microsoft usually sets the end-of-sales date for each version of Windows two years after the release of a new operating system. This means that the cutoff date for Windows 7 should have been October of 2014, which was two years after the launch of Windows 8. However, the severe lack of consumer demand for Windows 8 led Microsoft to keep the older operating system around a while longer. Windows 8, if you remember, was Microsoft's attempt at making a touch-friendly operating system, though it transitioned horribly onto non-touch devices and was highly panned by users.
If you want to continue using Windows 7 on your existing PC or laptop then you don't have to worry. Microsoft has announced that extended technical support will be available until January 14, 2020, meaning that you will be able to continue receiving patches, bug fixes, and other updates. This support is also offered to Windows 8.1 users, though it is extended to January 23rd, 2023.
Despite these deadlines, Microsoft is still heavily pushing Windows 10 to users. The new operating system, which came to users at the end of July of this year, is available as a free upgrade to users of Windows 7 or Windows 8.1 for the first year of its existence. Microsoft keeps sneaking in pop-ups on the regular to remind users that Windows 10 is available. Microsoft also classified Windows 10 as an "optional update" and, as early as next year, expects to change that to a "recommended update," according to Windows and Devices Group Executive Vice President Terry Myerson.
I have been using Windows 10 since it launched and it's really good. It takes the best things from Windows 7 and puts them in a modern format with new features and apps that really do bring all of your devices together. In addition to that, non-touchscreen users don't feel like they've been given second billing to touchscreen users, something that Windows 8 seemed to do. If you are looking for a Windows 7 or Windows 8.1 computer or laptop, you had better move quickly, because this time next year that won't be an option.
Content originally published here
Sharing this story on Social Media? Use these hashtags! #Microsoft #Windows10
Monday, November 2, 2015
Western Digital Is Buying SanDisk For $19 Billion
Western Digital recently announced that it is buying data storage vendor SanDisk for $19 billion in cash and stock. This merger is coming at a time when the IT industry is evolving at breakneck speed and companies are looking for new ways to get in on trends like wearable tech, the Internet of Things, and cloud computing. As a result, there has been a wave of mergers and acquisitions along with an increase in investment activity in the data storage market.
Western Digital's acquisition of SanDisk comes right after the market's biggest acquisition ever, Dell's $67 billion purchase of EMC. That deal hit news outlets last week and, in addition, storage semiconductor maker PMC-Sierra has received a number of takeover bids while Unisplendour, which is owned by China's Tsinghua, agreed to buy 15% of Western Digital for $3.78 billion. Western Digital focuses mainly on hard disk drives (HDDs), though it is facing an evolution in IT that is pushing companies toward a changing set of requirements for both client and enterprise end customers.
Enterprises no longer have to rely entirely on tape drives for backups and hard drives for primary data. However, they do have to deal with the higher speed requirements that come with applications like online transaction processing and big data analytics. Solid-state drives (SSDs) are a vital piece of multi-tiered storage infrastructures, with flash memory sitting just under DRAM as a top tier of storage. Western Digital and SanDisk are both based in California and both are heavily involved in different segments of the consumer data storage market: Western Digital offers desktop NAS drives while SanDisk is a leader in flash-based thumb drives and memory expansion cards.
Earlier in the year, SanDisk, which is also known for its SSDs for desktops and laptops, announced its very first lineup of pocket-sized, high-capacity external drives. This buyout of SanDisk gives Western Digital an instant position in the global, non-volatile NAND flash memory market, according to IDC Research Vice President Jeff Janukowicz. "Additionally, the NAND industry is at an inflection point as it transitions from planar to 3D technology and access to that technology was a key piece of the deal," Janukowicz stated. "Now, WD is positioned to address a much larger footprint in the storage industry."
Western Digital noted during its announcement of the deal that the combination will "enable it to vertically integrate into NAND, securing long-term access to solid state technology at lower cost." SanDisk has 27 years of experience in the NAND flash memory industry and recently announced a deal with Toshiba to manufacture the world's densest 3D NAND, a 48-layer, 32GB chip that offers twice the capacity of the next densest memory. During the announcement Western Digital also noted the 15-year partnership between SanDisk and Toshiba and stated that it expects that relationship to be "ongoing." According to the company, "The joint venture provides stable NAND supply at scale through a time-tested business model and extends across NVM technologies such as 3D NAND."
According to Gregory Wong, an analyst with Forward Insights, the deal between Western Digital and SanDisk will allow WD to enter the consumer SSD and enterprise SATA SSD markets. "WD wants SanDisk for the access to the flash. Their PC HDD business is declining due to the weak PC market but also because SSDs are encroaching that space," Wong added. "Without access to NAND flash at cost, it would've increasingly been difficult to compete with NAND players in the enterprise space."
Content originally published here
Sharing this story on Social Media? Use these hashtags! #WesternDigital #SanDisk #HDD #SSD
Friday, October 23, 2015
Did You Know About These Built-In Battery Saving Modes In Windows 10?
In case you didn't know, Windows 10 has a lot of neat little things built-in that allow you to save your battery life for longer. Sure there is the traditional power options menu that allows you to pick how long your computer should stay awake when you're not using it, but that's boring. What's cool is the new battery saver feature that switches off things like push notifications (Yes we know, your PC operating system has push notifications because Windows 10 was designed with mobile devices in mind too, get over it). So what do you need to know about these new battery settings?
The Power & Sleep menu, which can be accessed by going to Settings > System > Power & Sleep, isn't anything new. From this menu, you can choose how long your computer can be idle before it automatically shuts off the screen or goes into Sleep Mode. In addition to that, you can also customize it to do so when your computer is plugged in or running solely on battery power. There is also an Additional Power Settings link, which opens up the Power Options menu in the Control Panel. From there you can edit your power plans, choose what happens when you close your laptop lid, or decide whether or not you need a password to unlock your computer when it wakes up from Sleep Mode.
A new feature in Windows 10 is Battery Saver. This is a battery-saving power mode that has been specifically optimized for Windows 10, which means it can do things like limit background app activity and push notifications. This mode is very similar to ones that you would find on mobile devices like smartphones or tablets. By default, Battery Saver will automatically turn on when your laptop drops below 20% battery life, though you can turn that behavior off by going to Settings > System > Battery Saver.
You can also tinker with Battery Saver mode. From the Battery Saver menu, click Battery Saver Settings to bring up the settings menu. From here you can adjust the point at which Battery Saver mode automatically kicks in, anywhere from 5% to 100% battery life. You can also choose whether or not to allow push notifications or lower screen brightness in Battery Saver mode, or add app exceptions. Apps that you exclude from Battery Saver mode will be able to run in the background and send push notifications at all times.
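If you like to poke at this from the command line, here is a small sketch in Python that reads the battery level with the third-party psutil library and mimics a Battery Saver-style threshold check. The 20% threshold mirrors the default described above; this is only an illustration, not how Windows itself implements the feature.

```python
# Illustrative check that mimics Battery Saver's default trigger:
# warn when running on battery below a chosen threshold.
import psutil  # third-party library: pip install psutil

BATTERY_SAVER_THRESHOLD = 20  # percent; Windows 10's default trigger point

battery = psutil.sensors_battery()
if battery is None:
    print("No battery detected (desktop PC?)")
elif not battery.power_plugged and battery.percent <= BATTERY_SAVER_THRESHOLD:
    print(f"Battery at {battery.percent:.0f}% - a saver mode would kick in here")
else:
    print(f"Battery at {battery.percent:.0f}%, plugged in: {battery.power_plugged}")
```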
The main Battery Saver menu also lets you see how much of your battery life is being used by different apps; simply click Battery Use to see. This will help you determine which apps to disable in Battery Saver mode, which is extremely useful. You can turn off the apps that drain the most battery life and not even worry about the ones that barely use any at all.
If you find yourself always needing to be connected when you're using your laptop and you can't figure out why your Windows 10 device is using so much power, consider going into some of these settings and tinkering around with things. You might be surprised at how much more efficient your battery usage will become and how much longer your laptop will last without having to be plugged in.
Content originally published here
Monday, October 12, 2015
HP Envy 34 All-In-One Has Brilliant 34-Inch Curved Display
HP has just released its newest all-in-one PC, the beautiful Envy 34. This computer offers something unique that other all-in-ones do not: a 34-inch curved panel. It is definitely a fancy computer to look at and, as a result, isn't cheap.
HP has added Intel's latest 6th-generation Skylake CPU and the screen itself is an IPS display, allowing for wide viewing angles. The screen also comes with a 3440 x 1440 resolution as well as 4.9 million pixels and a 21:9 aspect ratio. According to HP, the Envy 34's display offers 99% of the sRGB color gamut and also has Technicolor certification.
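Those display figures are easy to verify with a little arithmetic: 3440 x 1440 works out to just under five million pixels, and the width-to-height ratio is what marketers round to 21:9.

```python
# Quick arithmetic check of the Envy 34's display numbers.
width, height = 3440, 1440

pixels = width * height
print(f"{pixels:,} pixels")           # 4,953,600 -> the "4.9 million" figure

ratio = width / height
print(f"{ratio:.2f}:1 aspect ratio")  # 2.39:1, marketed as 21:9 (21/9 ~ 2.33)
```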
As far as CPU options are concerned, the Envy 34 comes with a Core i5 or Core i7 Skylake dual-core with integrated graphics as standard. If that isn't quite good enough for you, then you can always opt for the GeForce GTX 960A. Nvidia typically makes special OEM versions of its GPUs, which generally means that this version will be a bit slower than the consumer model.
In terms of RAM, the Envy 34 includes 8GB or 16GB of DDR4, with storage options ranging from a 128GB SSD up to 2TB hard drives and hybrid drives. Naturally, the lower specs come with the base model of the Envy 34, which will run you $1,800 USD.
If the Envy 34 seems a bit too big for your liking, then you will be happy to know that HP is also offering 27-inch and 24-inch models, known as the Envy 27 and Envy 24, respectively. Both of these versions mirror the internal hardware of the Envy 34, with the Skylake Core i5 and Core i7 chips as well as similar RAM and storage options. However, on these devices HP is giving consumers the option of AMD Radeon R7 or Radeon R9 graphics.
In addition to that, the panels on the Envy 27 and Envy 24 are flat instead of curved. Both are Technicolor certified and have resolutions from standard 1920x1080 HD up to Ultra HD 4K. The Envy 24 starts at $1,000 with the Envy 27 starting at $1,200.
As far as all-in-ones are concerned, the Envy 34 from HP is definitely one of the best looking, and what it's got under the hood is nothing to shake a stick at either. Solid RAM, solid storage space, a solid processor, and solid graphics card options are sure to put this all-in-one at the top of many a computer lover's list.
Content originally published here
Sharing this story on Social Media? Use these hashtags! #HP #AllInOne #Envy34 #CurvedScreen
Tuesday, October 6, 2015
Amazon Looking To Cut Prices And Launch A High-Speed Database
Amazon is getting ready to host its annual AWS re:Invent tech conference next week in Las Vegas, with plans to talk to customers about its popular Amazon Web Services, the cloud computing platform the online retailer provides. However, one product rumored to be announced at the event is a new, super-fast "in-memory" database, according to Merrill Lynch's Justin Post. According to Post, "Amazon may announce new database products like in-memory databases or higher performance database services like Aurora (MySQL)."
An in-memory database runs in your computer's memory instead of relying on disk storage. It is also capable of processing unspeakable amounts of data at nearly instantaneous speeds, according to the description from Oracle chairman Larry Ellison of Oracle's version of this very same product. The in-memory option is one of the key ways that Oracle is convincing its customers to upgrade to its latest database, Oracle 12c.
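As a toy illustration of why in-memory matters, the sketch below compares lookups against a plain Python dictionary held in RAM with lookups that go to an SQLite file on disk. It only demonstrates the concept and is not a representation of Oracle's, SAP's, or Amazon's actual products.

```python
# Toy comparison: in-memory lookups vs. on-disk lookups.
import sqlite3
import time

rows = [(i, f"value-{i}") for i in range(100_000)]

# On-disk store
disk = sqlite3.connect("demo.db")
disk.execute("CREATE TABLE IF NOT EXISTS kv (k INTEGER PRIMARY KEY, v TEXT)")
disk.executemany("INSERT OR REPLACE INTO kv VALUES (?, ?)", rows)
disk.commit()

# In-memory store
memory = dict(rows)

start = time.perf_counter()
for i in range(0, 100_000, 7):
    disk.execute("SELECT v FROM kv WHERE k = ?", (i,)).fetchone()
print("on-disk lookups:  ", time.perf_counter() - start)

start = time.perf_counter()
for i in range(0, 100_000, 7):
    memory[i]
print("in-memory lookups:", time.perf_counter() - start)
```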
Another big name in this industry is SAP. SAP is trying to slowly wean its business software customers off of Oracle's database and onto its own in-memory alternative, known as Hana. SAP has wagered its entire company on the Hana database, according to SAP's chairman. In addition to that, Amazon already offers a plethora of ways to run in-memory databases on its cloud as well as a variety of its own databases.
Amazon has stated that it is working on more databases. In a job listing for a database developer the company said, "These are exciting times in our space - we are growing fast, but still at an early stage and working on ambitious new initiatives where an engineer at any level can have significant technical and business impact." Should Amazon introduce a new in-memory database, it won't be good news for either Oracle or SAP.
Databases are what an entire company's operations depend on. As a result, companies don't switch them out very often or very easily. However, database vendors are also known for some pretty wicked measures to get money out of their customers. As more and more businesses jump into cloud computing, a lot of them wouldn't mind finding less-expensive database alternatives. What are these less-expensive alternatives? Amazon.
Amazon is constantly cutting prices and, as of July, it has cut AWS prices 49 times and just announced another price cut for its storage service. Amazon is also due for even more cost cuts as rumors are circulating that the company will announce them at the AWS re:Invent tech conference.
This race to cut costs has generated a pretty catchy name: The Race to Zero. The idea is that, at some point, some cloud providers will cut prices so low that they are effectively giving services away for free. Google, earlier this year, actually went ahead and did that with its free Photos app and even threw in free unlimited storage too.
While Amazon delves deeper into database services and continues to cut costs along the way, enterprises may be very happy to give this new database a try.
Content originally published here
Sharing this story on Social Media? Use these hashtags! #Amazon #AWS #Oracle #SAP #InMemoryDatabase
Monday, September 14, 2015
Microsoft Gives You The Windows 10 Upgrade Whether You Want It Or Not!
Microsoft recently confirmed that it has been pre-loading Windows 10 installation bits onto devices whose owners have not "reserved" a copy of the operating system, let alone showed any interest in it at all. Naturally, this has upset some users of Windows 7 and Windows 8.1 with many complaining that the unsolicited downloads have caused them to exceed data caps from their internet service providers or seized storage space without their consent.
Microsoft released a statement acknowledging the downloads stating, "For those who have chosen to receive automatic updates through Windows Update, we help customers prepare their devices for Windows 10 by downloading the files necessary for future installation. This results in a better upgrade experience and ensures the customer's device has the latest software. This is an industry practice that reduces the time for installation and ensures device readiness."
If a Windows 7 or Windows 8.1 user has Windows Update set to the Microsoft-recommended default option, which allows the operating system to download and install security and other bug patches automatically in the background, then Microsoft will push the Windows 10 upgrade files to the drive.
This upgrade can range from over 3GB to almost 6GB and is placed in the hidden "$Windows.~BT" folder. This folder has long been a destination for Windows upgrades and the Windows 10 upgrade will remain here until the user expresses an interest in installing the operating system...at least that's what we hope.
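If you are curious whether those bits have already landed on your own machine, a short script like the hypothetical Python sketch below can total up the hidden folder's size. The path assumes a standard C: system drive, and the folder simply may not exist if nothing has been pre-loaded.

```python
# Check whether the hidden Windows 10 upgrade cache exists and how big it is.
import os

folder = r"C:\$Windows.~BT"  # hidden upgrade staging folder on the system drive

if not os.path.isdir(folder):
    print("No pre-loaded upgrade found.")
else:
    total = 0
    for dirpath, _dirnames, filenames in os.walk(folder):
        for name in filenames:
            try:
                total += os.path.getsize(os.path.join(dirpath, name))
            except OSError:
                pass  # some files may be locked or need elevated permissions
    print(f"{folder} is using {total / 1024**3:.1f} GB")
```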
Microsoft has been pre-loading the Windows 10 upgrade on systems since the end of July. Until now, though, it was believed that the practice was limited to PCs whose users had accepted Microsoft's free offer and reserved a copy of the operating system through an app the company automatically installed in the spring and early summer on nearly every PC running Windows 7 Home and Windows 8.1 Home, and on many PCs running Windows 7 Professional and Windows 8.1 Professional.
Once the Windows 10 upgrade was downloaded to the device, the user was notified via the app that installation was ready. But this new scheme is completely different in that the bits are downloaded to the PC even though the user has not asked for the upgrade at all. Not surprisingly, the people who noticed this first were the ones with data caps mandated by their internet service providers, especially those who rely on a cellular connection to the internet.
There is a particularly long thread on Slashdot that has several commenters claiming that they had exceeded their caps because Microsoft downloaded this massive update to their devices without their approval. One comment reads, "I had to travel recently, so I took a laptop with clean Windows 8.1 Pro install. At my destination, I purchased a SIM (they only had 1GB data packages) and put it into the 3G/Wi-Fi router I carry. I powered the laptop, connected to Internet via said router, checked a few things, then went away for a few hours. When I got back to the apartment, my data package (and Internet connectivity) was killed because Microsoft idiots decided to start downloading Windows 10 even though I have explicitly closed/rejected all the offers."
Other users did not appreciate the unwanted upgrade that landed on their limited storage space. Anyone with a 128GB SSD would be very unhappy if 5% of their storage capacity was suddenly occupied without their approval. Others wondered whether Microsoft would take the next logical step, either giving users notifications telling them to apply the pre-loaded upgrade or going further and triggering the upgrade automatically.
Sending notifications wouldn't be much different from what Microsoft has already done with users who accepted the free upgrade and reserved a copy. It is also possible that a lot of users on the receiving end of the notifications would approve the upgrade, or even appreciate the fact that they didn't have to wait a long time for the download to complete. However, if Microsoft went further and installed the upgrade without consent, people may very well grab their torches and pitchforks.
Content originally published here
Sharing this story on Social Media? Use these hashtags! #Microsoft #Windows10 #WindowsUpdate
Tuesday, September 8, 2015
IBM Creates A Limitless Linux
With significant drivers expanding mission-critical applications across the industry, it's no surprise that Linux is the fastest growing operating system in the business. Speed, agility, a unified development environment, and cost are all factors in this success. In addition, Linux's quality has advanced in recent years, with mobile driving an increased focus on the platform. IBM has a lot of eggs in this basket, pushing its z Systems platform to take advantage of this trend and provide the strongest large-scale offering around. IBM is convinced that it can expand this capability by taking Linux to new heights, and it is doing so by announcing an expansion of its coverage. Customers have begun asking IBM to give Linux the same focus as the company's most capable systems, and that's exactly what IBM is doing.
IBM's LinuxONE portfolio is designed to let you run the Linux distribution of your choosing with the scale and support you would expect from IBM's most powerful systems, like the z Systems line. The portfolio includes two offerings, both named after penguins: the LinuxONE Emperor and the LinuxONE Rockhopper. Emperor is the premier offering, providing the greatest flexibility and scalability along with the performance and trust needed for business-critical Linux applications; it also has the widest capacity range and no top end, meaning it can be expanded whenever and however you need. Rockhopper is the entry system, starting much smaller than Emperor and providing a solution for a smaller company or business unit.
Both systems let you choose your distribution, hypervisor, runtime, management tools, databases, and analytics. Hypervisor coverage includes PR/SM, z/VM, and KVM. Languages include Python, Perl, COBOL, Java, and Node.js. On the management side there are WAVE, IBM Cloud Manager, OpenStack, Docker, Chef, Puppet, and VMware vRealize Automation. For databases there are Oracle, DB2 LUW, MariaDB, MongoDB, and PostgreSQL, with analytics options including Hadoop, BigInsights, DB2 BLU, and Spark. The range of deployment options is also solid: single platform, multi-platform, on premise, off premise, and mixed cloud environments with a common toolset.
When it comes to pricing, things are pretty flexible. There's a pay-per-use model with no upfront payment, a fixed monthly or quarterly payment, and variable costing that scales with usage. A second model uses a 36-month fixed lease with a 36-month usage contract and a right to return after one year. The final option is a per-core rental model that lets you order what you need when you need it, add or decrease licenses as needed, or cancel with 30 days' notice.
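To put those models side by side, here is a minimal sketch in Python. The rates and usage figures are entirely hypothetical, invented for illustration, since the article doesn't give IBM's actual prices.

    # Hypothetical comparison of the pay-per-use and fixed-lease pricing styles
    # described above. All rates here are made up; real IBM pricing will differ.

    def pay_per_use_cost(core_hours, rate_per_core_hour=1.50):
        """Variable cost that scales with actual usage, no upfront payment."""
        return core_hours * rate_per_core_hour

    def fixed_lease_cost(monthly_payment=20000):
        """Fixed monthly payment under a 36-month lease."""
        return monthly_payment

    for hours in (5000, 10000, 20000):
        print(f"{hours:>6} core-hours: pay-per-use ${pay_per_use_cost(hours):>9,.2f}"
              f" vs fixed lease ${fixed_lease_cost():>9,.2f}")

Under those made-up numbers, light usage favors pay-per-use while heavy, steady usage favors the lease, which is exactly the trade-off the different models are meant to cover.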
Scale-out and scale-up support up to 8,000 virtual servers in a single system, along with tens of thousands of containers, tens of thousands of concurrent users, and the ability to run test, development, and production on one machine. There is a significant focus on speed: IBM has included its fastest processor, its biggest I/O pipe, up to 10TB of memory, and four levels of cache, which together deliver sub-second end-user response times at full load. The platform is designed to run at 100% utilization, to spin up containers and virtual servers in minutes, and to provision and reallocate physical resources automatically in seconds.
IBM's testing shows the system handling up to 30 billion RESTful web interactions per day with up to 350,000 database reads and writes per second, which IBM says is twice the NoSQL database performance of competitors, along with the largest single database node. That helps avoid the cost, complexity, and overhead of sharding.
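Those two figures are consistent with one another: 30 billion interactions spread across a day works out to roughly 350,000 per second. A quick back-of-the-envelope check in Python:

    # Sanity check on the quoted throughput numbers.
    interactions_per_day = 30_000_000_000
    seconds_per_day = 24 * 60 * 60        # 86,400

    print(f"{interactions_per_day / seconds_per_day:,.0f} interactions per second")
    # prints about 347,222, in line with the 350,000 reads/writes per second figure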
This is Linux without limits. The Open Mainframe Project is a key element of the effort, pushed forward by the Linux Foundation to drive capabilities and growth. IBM is showing its commitment with substantial funding and company resources, including IBM Linux Technology Centers, open source community contributions, academic initiative and training programs, and open access to mainframe community clouds. Members of the project include CA, Compuware, BMC, Marist College, and the University of Washington.
IBM is pulling out all the stops here, and this is one of the company's most significant moves in the past 10 years. IBM is enabling its centers worldwide to port, test, and benchmark key applications, with free access to the Mainframe Community Cloud for developers and students across a host of top-tier universities. IBM clearly sees its future tied closely to Linux and to the students coming out of those institutions, and it is focused on ever more mobile, flexible applications.
Content originally published here
Sharing this story on Social Media? Use these hashtags! #IBM #Linux #LimitlessLinux #LinuxONERockhopper #LinuxONEEmperor
Monday, August 31, 2015
New Solid-State Batteries From Samsung/MIT Could "Last A Lifetime"
Batteries. We need them for almost everything we use. They're in our laptops, our smartphones, our video game controllers, our remotes, and anything we use that is now wireless. But the one thing wrong with batteries in today's world is that they are finite. Eventually, they will run out of power and there's not a whole lot we can do about that. Or is there?
Researchers have recently developed a new material for a basic battery component that, according to them, will allow almost any battery to store power indefinitely. This new material, known as a solid electrolyte, could increase battery life as well as storage capacity and safety, since liquid electrolytes are the leading cause of battery fires.
The standard lithium-ion batteries that we use today use a liquid electrolyte. This liquid electrolyte is an organic solvent that has been known to overheat and cause fires in things like cars, commercial airliners and even smartphones. With a solid electrolyte, there is absolutely no safety problem whatsoever.
According to Gerbrand Ceder, a professor of materials science and engineering at MIT and one of the main researchers on the project, "You could throw it against the wall, drive a nail through it - there's nothing there to burn." In addition to that, a solid-state electrolyte will have virtually no degradation, which means that such batteries could last through "hundreds of thousands of cycles," Ceder continued.
Organic electrolytes also have limited electrochemical stability, which means that they lose their ability to produce an electrical charge over time. In addition to MIT, scientists from the Samsung Advanced Institute of Technology, the University of California at San Diego and the University of Maryland also conducted research on the project.
The findings were published in the peer-reviewed journal Nature Materials and the researchers described the solid-state electrolytes as an improvement over the current lithium-ion batteries we are using today. Electrolytes are one of three main components in a battery along with anode and cathode terminals.
The electrolyte component of the battery separates the battery's positive cathode and negative anode terminals while allowing the flow of ions between terminals. A chemical reaction then takes place between the two terminals, producing an electric current.
The problem with previous solid electrolytes was that they couldn't conduct ions fast enough to be efficient energy producers. The team of researchers from MIT and Samsung say they have overcome that problem. Another advantage of a solid-state lithium-ion battery is that it can perform in very cold temperatures, with Ceder calling the breakthrough "a real game-changer" that creates an "almost perfect battery".
Content originally published here
Sharing this story on Social Media? Use these hashtags! #MIT #Samsung #Batteries #SolidStateBatteries
Acer's Aspire Z3-710 All-In-One Gets Windows 10
On Monday, Acer announced that its 23.8-inch Aspire Z3-710 Series of all-in-one desktop PCs will save you some time by shipping with Windows 10, so you won't have to do the free upgrade from Windows 8.1.
The Z3-710-UR55 at $750 and Z3-710-UR54 at $900 will be the two models that ship with Windows 10. The Z3-710-UR55 comes with an Intel Core i3-4170T dual-core processor clocked at 3.2GHz with 3MB of cache, 6GB of DDR3L RAM, a 1TB hard drive, a DVD writer, 802.11ac Wi-Fi, Bluetooth 4.0 + LE, a 1080p webcam, three USB 2.0 ports, two USB 3.0 ports, GbE LAN, and stereo speakers.
The Z3-710-UR53 has a Core i5-4590T quad-core processor clocked at 2GHz with 6MB of cache and 8GB of RAM. If you're just doing general-purpose computing chores, it might not be worth the additional $150, as you only get 2GB of additional RAM, and while the processor is an upgrade in cores and cache, it's a downgrade in clock speed.
Looking for a Quote on a Desktop Rental for Your Business Event? Rentacomputer has a full range of Desktop Rentals for your event! Our short-term rentals are available in over 1,500 cities nationwide!
Acer says both systems come in a slim 1.4-inch chassis with a display that can be tilted from 5 to 25 degrees using only two fingers. If these don't sound appealing to you, Acer still offers the Z3-710-UR59, a Windows 8.1 model, at only $700. It has an Intel Pentium G3260T dual-core clocked at 2.9GHz, 4GB of RAM, and a 1TB HDD. All of the systems sound pretty cool, and the fact that they are all-in-ones is a definite bonus. You just need to find which one best suits your needs.
Content originally published here
Sharing this story on Social Media? Use these hashtags! #Acer #Aspire #AllinOne
Wednesday, August 12, 2015
Most Popular Programming Languages State by State
When it comes to programming languages, most people probably can't name more than two. Popular answers typically include languages like C++ or JavaScript, but there is a whole host of other languages out there, and just because you've never heard of one doesn't mean it isn't the most popular programming language on the other side of the country.
Silicon Valley may be the hotbed of new and exciting tech, but most programmers and developers are working hard in other industries, like big business. The programmer Q&A site Experts Exchange recently delved into its own data to determine exactly which programming languages are the most popular in the United States.
The survey took a look at who was asking questions about which programming languages, which was one of the factors in determining which states used which programming languages. In addition to that, users who were qualified "Experts" on the site seemed to favor PHP heavily.
If you look at the data and simply go by the number of questions asked, it's clear that Microsoft's .NET is seeing increased use. Most people in the business, however, are not surprised by these results.
PHP is a scripting language that is pretty much the standard among web developers, despite the fact that a lot of programmers don't seem to like it all that much. Meanwhile, .NET is a Microsoft standard that isn't exactly the most talked about, but it's a good entry point for coders just getting started building apps around Microsoft platforms, which are widely used in the business world.
What this data also shows Silicon Valley is that the popularity of the newest programming languages is insignificant compared to the driving force of existing options. Some of the trendier languages, like Ruby on Rails and Swift, don't even appear on the list.
So what are the most popular programming languages in the country? Here's the list:
- ASP
- Cold Fusion
- C++
- C#
- Delphi
- Java
- JavaJ2EE
- JavaScript
- .NET
- PHP
- Powershell
- Python
- Shell
- SQL
- VisualBasic
Just to note, West Virginia and North Dakota didn't return any significant data... coders must not exist there.
Content originally published here
Labels: .NET, ASP, C#, C++, Cold Fusion, Delphi, Experts Exchange, Java, JavaJ2EE, JavaScript, PHP, Powershell, programming languages, Python, Shell, SQL, VisualBasic
Friday, July 10, 2015
Google Looking For Redemption With Google Glass For Enterprise
I guess the science fiction-esque future that Google Glass suggested in its ads is a little further ahead of us than Google realized. Google Glass was nothing short of a huge consumer flop; the general public just wasn't ready for goofy-looking augmented reality glasses.
But that doesn't mean the product is dead, at least not yet. Based on a recently discovered FCC filing, 9to5Google has revealed the next version of Google Glass. This edition will be geared toward applications in the enterprise space, leaving out the novelty, consumer-oriented functions featured in the first "Explorer Edition".
The "Enterprise Edition" Google Glass is said to sport a larger prism display for a better augmented experience. This serves as an attempt to minimize eye strain many early glass users complained about.
Additionally, the Enterprise Edition will drop the Texas Instruments processor for an Intel Atom processor, which is said to be faster and offer better battery life than existing Android Wear smartwatches.
While the Explorer Edition often overheated or ran out of battery, the Enterprise Edition promises to run cooler, even with the additional external battery pack Google is experimenting with.
The last thing cited in the report was super-fast 802.11ac Wi-Fi with dual-band support for 2.4GHz and 5GHz wireless channels, meaning even faster video streaming.
Google's decision to shift its Glass strategy toward enterprise customers is a big but intelligent move. Assisting with specialized medical, law enforcement, or business applications does seem more important than letting consumers play virtual reality games.
Content originally published here
Monday, June 29, 2015
Crytek's CryEngine Embraces Linux
Linux gaming is starting to catch on and build some momentum. Following in the footsteps of Valve's Source engine, Epic's Unreal Engine 4, and Unity 5, Crytek's CryEngine now supports Linux, which also means support for SteamOS. It also means it will be much easier for developers currently building games on these engines to add Linux support.
Even so, developers will still have to go a little out of their way to add Linux support to their Steam games, so not every game that comes out will have it; don't get your hopes up on that. But engine-level support reduces the effort needed by a lot, so expect more Linux titles in the future as the technology becomes more widely adopted.
This might not be huge news for indie games, since smaller studios may not want to invest the extra time in Linux support, but for big new AAA titles the cost of porting to Linux drops considerably, and because SteamOS is a promising new platform, it's starting to look like a much better idea to the big gaming companies. When a game's core engine already supports the platform, everything else is relatively simple because most of the hard work is already done.
On top of all that, engines that already support Linux should see a big improvement in the quality of their ports. Some of the Linux games currently on Steam lean heavily on Windows code and Direct3D, which makes performance problematic for Linux users. This change means developers can do away with whatever tricks they were using to make Windows code run (badly) on top of Linux.
Content originally published here
Wednesday, June 3, 2015
Avago Acquiring Broadcom for $37 Billion
According to Avago Technologies, it is ready to buy Broadcom for a whopping $37 billion. That is a huge amount of money that could probably buy you anything you ever wanted, and Bloomberg says it is the biggest tech deal ever made. Avago says that after the deal is done, the combined company will be worth $77 billion.
The new company will be called Broadcom Ltd and will be headed by Hock Tan, the CEO of Avago. Behind companies like Intel, Samsung, TSMC, Qualcomm, and Micron, Broadcom would be the sixth-largest semiconductor company in the world.
Many people don't really know about Avago, but it started out as a division of Hewlett-Packard before being spun off into its own company years later, and everyone is pretty familiar with HP. Avago specializes in products for wireless communications, wired infrastructure, enterprise storage, and industrial applications. Broadcom is mainly known for its communications chips and video solutions; it also makes the chips for the popular Raspberry Pi computers.
The chip industry has already seen big moves like this. Just a couple of months ago, NXP announced it was planning to acquire Freescale for just under $17 billion. It's too soon to say how the chip industry will be affected or what's to come from this new acquisition, but as soon as details drop you'll find them here on A Computer Blog.
Get a Quote on a Server Rental today from Rentacomputer! If your server is down or out for repairs then a Server Rental from Rentacomputer is the perfect solution!
Content originally published here
Tuesday, June 2, 2015
Google Holding Ubiquitous Computing Summit This Fall
The Google I/O developers conference is what most people look forward to from Google every year. Even though that event has already come and gone, that doesn't mean there isn't anything left to look forward to from the company for the rest of the year. Google just announced that it will be holding a Ubiquitous Computing Summit this fall in San Francisco, California.
The name alone doesn't tell you much, but the event will focus on making it easier to use software across many different devices and form factors. The idea is that software should be universal across things like smartphones, tablets, TVs, smartwatches, cars, and so on.
On the developer side, the goal is to make all of these devices run the same universal software without changing any of the code. A Google developer has also said the summit will focus on context-aware apps that know which device is running them, where they're being used, and how, which is pretty interesting. Google is also working on guidelines for developing this kind of software.
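As a rough illustration of what a context-aware app might do, here is a hedged Python sketch; the device categories and screen-size cutoffs are invented for this example and aren't anything Google has published.

    # Hypothetical sketch: pick a layout based on the device running the app.
    # Categories and thresholds are invented purely for illustration.

    def choose_layout(device_type, screen_inches):
        if device_type == "watch" or screen_inches < 3:
            return "glanceable"        # a single card, minimal text
        if device_type in ("phone", "tablet") and screen_inches < 8:
            return "single-column"     # touch-first, scrolling layout
        if device_type == "tv" or screen_inches >= 32:
            return "lean-back"         # large type, remote-friendly navigation
        return "multi-column"          # desktops, large tablets

    print(choose_layout("phone", 5.5))   # single-column
    print(choose_layout("tv", 55))       # lean-back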
The idea isn't new; Google has been talking about doing this type of thing for years. All of the different versions of Android, like Lollipop and Jelly Bean, were said to be steps toward unifying the Android experience across devices, and over the past year Google has brought together its Android development kits for all of the different form factors. Even Microsoft is jumping on the bandwagon, making Windows 10 run not only on PCs but also on its smartphones, tablets, and even the Xbox One.
The Ubiquitous Computing Summit doesn't have an exact date yet beyond being held this fall in San Francisco, but as more information surfaces you'll find it here.
Interested in a Quote on a Tablet Rental for the Google Summit? Rentacomputer has all the latest in Tablet Rental technology, including the Google Nexus!
Content originally published here
Tuesday, May 26, 2015
Interest in Big Data Increasing
Big data, as we know it, hit the scene in about 2001 so it is nothing new. Companies have been gathering this data for years, but recently there has been an explosion of interest in it like it's a brand new thing. This is mainly due to the fact that there are now new ways of analyzing the data.
Big data started out as a way to store a ton of information, but now it has transformed into something where companies are trying to use it in any way they can to benefit from it in different ways. So here is a list of a few reasons why this is happening.
1. Unstructured data is now readily available
Unlike traditional business insight that analyzes structured data, big data focuses on the unstructured data. This includes emails, videos, photos and posts on social networking sites. Millions of photos get uploaded to Facebook, millions of tweets go out every day, and businesses can use this information to understand their customers better. This helps with suggested sales and such.
2. It has become incredibly cheap to store huge amounts of data
Unstructured data is becoming more pervasive, but tools like Hadoop, an open-source framework for storing and processing large-scale data, have matured enormously over the last ten years or so. These types of tools underpin data processing for some of the world's largest businesses with the most data to store, and they handle unstructured data far faster and more cheaply than the generation of tools that came before them.
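Hadoop itself is a large distributed system, but the map-and-reduce idea behind it fits in a few lines. The toy sketch below is plain Python rather than actual Hadoop code; it counts words across a pile of unstructured text, which is the same shape of job Hadoop spreads across thousands of machines.

    # Toy MapReduce-style word count, run on one machine for clarity.
    from collections import Counter

    documents = [
        "millions of photos get uploaded every day",
        "millions of tweets go out every day",
    ]

    # "Map" step: emit a (word, 1) pair for every word in every document.
    pairs = [(word, 1) for doc in documents for word in doc.split()]

    # "Reduce" step: sum the counts for each word.
    counts = Counter()
    for word, n in pairs:
        counts[word] += n

    print(counts.most_common(3))   # e.g. [('millions', 2), ('of', 2), ('every', 2)]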
3. Not only is it now cheaper to process all of this data, but companies are getting usable information out of it
Retail stores are really taking advantage of big data right now. What are they getting from it? They use customer loyalty cards to gather information, which is a really good way to figure out what sells and what to stock. For example, a store might not normally stock a lot of low-cost generic food brands, but if the data shows that its highest-spending customers tend to buy them, guess what it's going to stock more of? Low-cost generic foods.
Loyalty cards and even debit cards give retailers huge insight into each individual customer. There was a big story about this last year, actually: a Target customer got really mad when the company sent his young daughter coupons for baby clothes, because the data on what she was buying suggested she might be expecting a baby soon. It seems like kind of a mess-up on Target's part, but it shows just how much these analytics can infer.
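A hedged sketch of that kind of loyalty-card analysis might look like the following; the transactions are made up, and a real retailer would be working with millions of rows, but the idea of "find what the top spenders buy and stock more of it" is the same.

    # Hypothetical loyalty-card analysis: what do the highest spenders buy?
    from collections import Counter, defaultdict

    # (customer_id, item, price) -- invented sample transactions
    transactions = [
        ("c1", "generic pasta", 1.00), ("c1", "steak", 15.00),
        ("c1", "generic pasta", 1.00), ("c2", "steak", 15.00),
        ("c2", "generic pasta", 1.00), ("c3", "soda", 2.00),
    ]

    # Total spend per customer.
    spend = defaultdict(float)
    for customer, _, price in transactions:
        spend[customer] += price

    # Take the top half of customers by spend, then count what they buy.
    ranked = sorted(spend, key=spend.get, reverse=True)
    top_spenders = set(ranked[: len(ranked) // 2 or 1])

    favorites = Counter(item for customer, item, _ in transactions
                        if customer in top_spenders)
    print(favorites.most_common())   # items worth keeping well stocked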
4. Big data analytics could lead to productivity gains in four sectors
If big data analytics goes mainstream, retail chains and manufacturing companies could add as much as $325 billion to annual GDP thanks to increased efficiency, while healthcare and other government services could see productivity gains of as much as $285 billion over the next five years.
5. Big data analytics saw a huge increase in venture capital funding in the last 12 months
Venture capitalists have started exploring the possibilities of big data analytics as well. In the last year alone they have invested about $1.37 billion in big data companies, an increase of 217% over the previous period. That's because big data is now enterprise-ready, analytics tools are accessible to pretty much anyone and easy to use, and the analysis can be done in real time.
Looking for a Quote on a Laptop Rental for your next event or conference? Check out Rentacomputer's huge line of laptop rentals from all the top brands!
Content originally published here
Wednesday, May 20, 2015
Microsoft Reveals Every Version of Windows 10
Microsoft has not yet given an exact release date for Windows 10, other than that it will arrive sometime this summer, but it is revealing all of the different editions of the operating system that will be available when it's done. Windows 10 Home is the "consumer-focused desktop edition"; it's the one that will come pre-installed on most home PCs and laptops, and it will have pretty much all of the neat features, like the new Edge browser, Hello face recognition, and the built-in universal apps. There will also be a "Pro" edition, as in previous versions, that comes with a bunch of business software, the ability to join domains, and access to business-focused Windows update options.
With Windows 10 rolling out across PCs, smartphones, tablets, and the Xbox One, Microsoft is also renaming Windows Phone: the new name is officially "Windows 10 Mobile". It will have touch-optimized versions of Office and support for the new Continuum for Phone feature, and it will run on phones and small tablets under 8 inches in size. There will also be a Windows 10 Mobile Enterprise edition designed for big businesses to license the operating system on smartphones and small tablets.
Those are the main editions, but on top of them there will be a few others, like Windows 10 Enterprise, Windows 10 Education, and Windows 10 IoT Core for smaller gateway devices; in total there are seven editions of the operating system for a bunch of different devices. Since this is meant to be the last version of Windows, with Microsoft simply building on it from here on out, we can also expect major updates and additions sometime around fall of this year. And from there on out, who knows what kind of sweet updates we will see.
Do you need a Rental Quote for a Computer Rental? Rentacomputer.com is the #1 computer rental provider in the United States, offering local, professional delivery and installation anywhere in the country!
Content originally published here
Tuesday, May 12, 2015
New Ways to Store Big Data on Azure from Microsoft
Azure is getting a new data warehouse service, a "data lake" service for storing huge amounts of data, and the option of running "elastic" databases that can handle sets of data that vary in size. Scott Guthrie, Microsoft's Executive Vice President of the cloud and enterprise group, unveiled these new services at the company's Build 2015 conference in San Francisco.
The Azure SQL Data Warehouse will be up and running later this year and will give companies a way to store petabytes of data in a form that can be easily consumed by analysis tools like Microsoft's Power BI for data visualization, Azure Data Factory for data orchestration, and the Azure Machine Learning service.
One thing that makes this data warehouse service different from the rest is its ability to adjust to fit the amount of data that actually needs to be stored. You can also specify exactly how much processing power you need to analyze the data. The service builds on the parallel processing architecture Microsoft developed for its SQL Server database.
This new cloud service is made for companies and organizations that need to store massive amounts of data so it can be analyzed by platforms like Hadoop, and it could also be very useful for Internet of Things systems that generate huge amounts of data. The amount of data you'll be able to store is effectively unlimited, so you can see how this would be helpful.
The company also updated the Azure SQL Database service so that customers can pool their Azure databases, reducing storage costs and preparing for spikes in activity. In short, you can manage your storage at a lower cost.
All of this will be very useful for running public-facing software services where the amount of space used can fluctuate a lot from day to day. With most services like this, you generally pay for your peak storage capacity no matter how much of it you are actually using. Pooling means you can cut your costs, possibly in half, and pay only for what you are using at any given time.
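The cost argument is easy to see with a small worked example. The prices and usage numbers below are hypothetical, purely to illustrate paying for provisioned peak capacity versus paying for what a fluctuating workload actually uses.

    # Hypothetical comparison: provision-for-peak vs pay-for-actual-usage billing.
    daily_usage_gb = [200, 250, 900, 300, 220, 180, 850]   # a spiky week of storage use
    price_per_gb_day = 0.05                                # made-up rate

    peak_cost = max(daily_usage_gb) * price_per_gb_day * len(daily_usage_gb)
    usage_cost = sum(daily_usage_gb) * price_per_gb_day

    print(f"pay for provisioned peak: ${peak_cost:.2f}")   # 900 GB held all week
    print(f"pay for actual usage:     ${usage_cost:.2f}")  # less than half in this example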
Content originally published here
Monday, May 11, 2015
Microsoft Will End Support for Windows Media Center with Release of Windows 10
Windows 10 is coming out this summer, but it will not run Windows Media Center, Microsoft's long-standing media center software for PCs. If you've tried the preview versions of Windows 10, you'll have noticed that Media Center isn't compatible with it. "We can confirm that due to decreased usage, Windows Media Center will not be part of Windows 10," a Microsoft spokesperson told PCWorld via email. Media Center hasn't had a significant update in almost six years, and it was nearly unchanged between Windows 7 and Windows 8, so ending it shouldn't come as much of a surprise.
With the release of Windows 10, Microsoft has decided to drift away from the whole idea of having your computer run everything in your living room and turn it into an entertainment hub. The idea of having your computer connected to everything in your house was pretty cool and certainly convenient, but the problem is that it was a huge pain to set everything up and the possibility of something going wrong was high. While everything was working, though, it was super cool.
The idea, though, never really caught on. It doesn't make sense for most people these days, and it is too much to worry about. With smart TVs and smartphones, you pretty much have access to any TV program or movie ever. Having a PC dedicated to the living room just isn't worth it. It isn't 1998 anymore.
Microsoft still has plans for software in the living room; those plans just don't involve a PC anymore. The company offers a Miracast-powered TV dongle called the Wireless Display Adapter, which lets you project your phone's screen onto your TV. That makes more sense to me than all of the complicated stuff involved in keeping a dedicated living room PC just for entertainment.
Another option is the Xbox One, which offers TV-centric features like a digital TV tuner and apps such as Netflix and Sling TV. A PC is, of course, much more customizable, but it's not everyone's first choice. If Valve's new Steam Machines become popular, they could serve HTPC purposes on top of their PC gaming functions.
If you are running a Windows HTPC that relies on Windows Media Center, don't bother upgrading it to Windows 10. If you do need to replace your current HTPC, stick with a Windows 7 or Windows 8.1 PC. And if you don't mind losing Media Center, you could go with Plex Media Server or XBMC, which work with Windows and Linux as well as the Raspberry Pi.
Content originally published here
Monday, April 27, 2015
Google Looking to Increase Internet Speeds with QUIC Network
Google has been stress-testing its QUIC (Quick UDP Internet Connections) network protocol over the past quarter. Right now, roughly half of all requests from Chrome to Google servers are served over QUIC, and the company says the performance improvement over TCP is leading it to push an increasing amount of traffic through the protocol.
QUIC supports multiplexed transport over UDP while providing security comparable to TLS/SSL, and it cuts down on latency by relying on UDP instead of TCP. For latency-sensitive services like Google Search, the UDP-based QUIC outperforms TCP mainly by re-establishing connections with servers it has already communicated with, without any extra round trips to the server. QUIC also gets the better of TCP in poor network conditions.
“The standard way to do secure web browsing involves communicating over TCP + TLS, which requires 2 to 3 round trips with a server to establish a secure connection before the browser can request the actual web page,” Google wrote in a blog post Friday.
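The latency win comes almost entirely from shaving those round trips. A rough estimate of time-to-first-request under each scheme, using an arbitrary round-trip time (real handshakes and networks vary):

    # Rough handshake-latency estimate based on the round-trip counts quoted above.
    rtt_ms = 100   # arbitrary round-trip time, e.g. a slow mobile connection

    setups = {
        "TCP + TLS (about 3 round trips)":         3 * rtt_ms,
        "QUIC, first contact (1 round trip)":      1 * rtt_ms,
        "QUIC, repeat connection (0 round trips)": 0 * rtt_ms,
    }

    for name, delay in setups.items():
        print(f"{name}: {delay} ms before the page request can even be sent")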
When re-transmitting a packet, QUIC never reuses packet sequence numbers, which makes it easier to work out which packets have been received and avoids spurious timeouts. Choosing QUIC over TCP has shaved about a second off the Google Search page load time for the slowest 1% of connections, and it helps video sites like YouTube as well: viewers report about 30% fewer rebuffers when watching videos over QUIC.
Google also plans to propose QUIC to the Internet Engineering Task Force (IETF) as an Internet standard, after some changes such as revising the wire format and moving from SPDY-over-QUIC to HTTP/2-over-QUIC. In the meantime, Google wants to keep increasing the share of traffic carried over the protocol, with the eventual goal of entrusting it with all traffic from Google clients to Google services.
Google is expecting good things to come from this so we'll see how it all plays out in the end.
Content originally published here