Microsoft OneDrive

Microsoft OneDrive has become one of the most useful tools in the Microsoft 365 suite, and more companies adopt it every day. Between its robust feature set and constant updates, it is easily on par with other cloud storage solutions, and because it is included with every Office 365 plan, there is little reason for an Office 365 user not to use it.

Originally rolled out as SkyDrive in 2007, the service was renamed OneDrive in 2014 after a trademark dispute with the British broadcaster Sky. Like many other Microsoft products, OneDrive comes in two versions, a consumer edition and an enterprise edition. The two are very similar, but the enterprise edition is much better suited to businesses; the most important difference is the ability to centrally manage the entire organization's OneDrive. Microsoft has been adding new features steadily since release. So what do people like most about OneDrive?

Local sync folder

When OneDrive is installed, it creates a OneDrive folder on your computer. This folder acts like a regular file folder. It looks just like any folder you would find in your favorites bar, such as Documents or Downloads, but it also syncs all of its contents to OneDrive's cloud storage. So not only is it easy to use, it is also accessible from any computer once you log in. With 1TB included in all Office 365 subscriptions (5GB free for non-subscribers), most users can fit most, if not all, of their files in this single folder. OneDrive also supports subfolders within the OneDrive directory, so you can keep everything organized the way you like it.

Files On-Demand

To keep local storage usage low, files are kept in OneDrive's cloud storage until you use them. A file is not stored on your local drive unless you specifically mark it or its folder "Always keep on this device". You can still see and browse all of your files.
They download automatically when opened. You can switch a file or folder between local and cloud-only storage just by right-clicking it and choosing the appropriate option. As long as you prepare ahead, there is no need to worry about needing files while you are offline.

Sync existing folders

A recent addition to OneDrive is the ability to sync folders other than the OneDrive folder, most usefully the Documents and Pictures folders. This is an easy way to back up valuable pictures and documents that don't necessarily fit into your organization of the OneDrive folder, and you can even sync these folders between computers. I use it to sync my desktop backgrounds across all of my computers: if I find an awesome picture that would make a great background, I just save it into my backgrounds folder. It syncs to OneDrive and is automatically added to the desktop background slideshow on both my work computer and my personal computer.

Sharing files

In both editions of OneDrive, sharing a file is as easy as right-clicking it and choosing Share. There are multiple sharing options: a read-only or editable link, a password-protected link, and a link usable only by a specific person. You can set all sorts of permissions, especially in the enterprise edition. Access controls are immensely important in the business world when sharing sensitive information, and even better, these access controls can be managed by an admin, giving businesses more control over who sees their data and how.

Creating shared folders

One of the easiest ways to share files is to create a shared folder that multiple people can access. Whether you want a folder for a single team, a whole department, or even the whole company, the process is fairly easy: create the folder and add the right people to its list of users. Every user then has access to the folder and all files within it, with varying levels of permissions.
For example, you may want some users to have only read access while others get write access.

Linked content

OneDrive gives you the ability to create a link to a file and send that link to someone else, which is particularly useful when emailing files, especially to someone outside your organization. Not only is this convenient, it adds another layer of security to emailed files, and it saves space in everyone's mailbox by eliminating large attachments.

OneDrive has come a long way in the last few years. It was once overshadowed by other cloud storage services like Dropbox and Box, but with its Microsoft 365 integration and robust security features, it is now easily one of the leaders in the space. It is clearly a core application in the Microsoft Office suite, and I think you will find it an extremely useful tool in the business world.

Looking to migrate to Microsoft 365 and see the advantages of OneDrive? Contact ProCern for more information.
Do your Linux servers use LVM? If not, you should strongly consider it, unless you are using ZFS, BTRFS, or another "controversial" filesystem. ZFS and BTRFS are outside the scope of this discussion, but they are definitely worth reviewing if you haven't heard of them and are running Linux in your environment.

Logical Volume Manager (LVM) for Linux is a proven storage technology created in 1998. It offers layers of abstraction between the storage devices in your Linux system and the filesystems that live on them. Why would you want to add an extra layer between your servers and their storage, you might ask? Here are some reasons:

Flexibility – you can add more physical storage devices as needed and present them as a single filesystem.
Online maintenance – need to grow or shrink your filesystems online, in real time? This is possible with LVM. It is even possible to live-migrate your data to new storage.
Thin provisioning – you can over-commit your storage if you really want to.
Device naming – you can name your devices something that makes sense instead of whatever name Linux assigns. Meaningful names like Data, App, or DB are easier to understand than sda, sdb, and sdc, which also reduces mistakes when working with block devices directly.
Performance – it is possible to stripe your data across disks to improve performance.
Redundancy – it is also possible to add fault tolerance to ensure data availability.
Snapshots – this is one of my favorite reasons for using LVM. You can take point-in-time snapshots of your system, copy them off somewhere else, or even mount them and manipulate the data more granularly. Want to do something risky on your system, with a quick rollback path if it doesn't work out? LVM is perfect for this.

So how does it work?
According to Red Hat: "Logical Volume Management (LVM) presents a simple logical view of underlying physical storage space, such as hard drives or LUNs. Partitions on physical storage are represented as physical volumes that can be grouped together into volume groups. Each volume group can be divided into multiple logical volumes, each of which is analogous to a standard disk partition. Therefore, LVM logical volumes function as partitions that can span multiple physical disks."

I think LVM is much easier to understand with a diagram. The image above illustrates some of the concepts involved. Physical storage devices recognized by the system can be presented as PVs (physical volumes); a PV can be an entire raw disk or a partition, as illustrated above. A VG (volume group) is composed of one or more PVs. This is a storage pool, and it can be expanded by adding more PVs; it is even possible to mix and match storage technologies within a VG. LVs (logical volumes) are then allocated from the VG's pool of storage and appear to the system as raw block devices. These devices are formatted with the filesystem of your choice, and they can grow or shrink as needed, as long as space is available for the operation.

You really should be using LVM on your Linux servers. Without LVM, many of the operations discussed above are typically offline, risky, and painful, and all of that amounts to downtime, which we in IT like to avoid. While some may argue that the additional abstraction adds unnecessary complexity, I would argue that LVM really isn't that complicated once you get to know it, and in my opinion its value greatly outweighs its complexity. The value proposition is even greater when using LVM on physical Linux nodes with local storage. SAN storage and virtual environments in hypervisors typically have snapshot capabilities built in, but even those do not offer all of the benefits of LVM.
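As a rough sketch of how the pieces above fit together, the following commands walk through creating PVs, pooling them into a VG, carving out an LV, growing it online, and taking a snapshot. The disk names (/dev/sdb, /dev/sdc) and volume names are hypothetical, and all of these commands require root on a system with spare block devices:

```
# 1. Mark raw disks (or partitions) as physical volumes (PVs)
pvcreate /dev/sdb /dev/sdc

# 2. Pool the PVs into a volume group (VG) with a meaningful name
vgcreate vg_data /dev/sdb /dev/sdc

# 3. Carve a logical volume (LV) out of the pool
lvcreate --name lv_app --size 50G vg_data

# 4. Format and mount it like any other block device
mkfs.ext4 /dev/vg_data/lv_app
mount /dev/vg_data/lv_app /app

# Grow the LV and its filesystem online, no unmount needed
lvextend --size +20G --resizefs /dev/vg_data/lv_app

# Take a point-in-time snapshot before doing something risky
# (the snapshot needs its own space to track changed blocks)
lvcreate --snapshot --name lv_app_snap --size 10G /dev/vg_data/lv_app

# If things go wrong, roll back by merging the snapshot into the origin
lvconvert --merge /dev/vg_data/lv_app_snap
```

Note that if the origin LV is mounted when you run lvconvert --merge, the merge is deferred until the volume is next activated, typically at the next reboot.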
In those environments, LVM still offers another layer of protection. The aforementioned ZFS and BTRFS are possible alternatives, and arguably better choices depending on who you ask; however, due to licensing issues (ZFS) and potential stability issues (BTRFS), those technologies require careful consideration. Perhaps those considerations are topics for a future blog…

Want to learn more? Please reach out, we're here to help.
As the digital economy expands, the volume of data generated by businesses continues to grow exponentially. In fact, global data creation is projected to exceed 180 zettabytes by 2025, up from 64.2 zettabytes in 2020. This surge has pushed traditional data storage systems to their limits, warranting more advanced solutions. Artificial intelligence (AI) has emerged as a revolutionary force in transforming how data is managed, analyzed, and stored, offering efficiencies that are critical in an era of such immense data growth.

AI's Role in Data Storage Innovation to Unlock New Efficiencies

AI-driven predictive analytics: One of the most profound impacts of AI on data storage is predictive analytics: the ability to analyze patterns within vast data sets to predict and preemptively address potential system failures. This proactive approach minimizes downtime and improves data availability, which is crucial for businesses where data accessibility directly impacts operational efficiency. Predictive analytics also extends the lifespan of hardware by identifying issues before they escalate, cutting costs related to both maintenance and replacement.

Enhanced data management and automation: AI streamlines complex data management tasks that traditionally require manual intervention. Automated data tiering and load balancing optimize storage resources in real time, ensuring data is stored efficiently based on usage and value. Meanwhile, AI-enhanced snapshot management automatically creates and manages backup snapshots according to the data's criticality and usage patterns, enhancing data integrity and improving recovery times. The result is lower operational overhead and greater overall system efficiency, providing substantial cost savings and operational agility.
Better security protocols: Through continuous learning, AI models can detect unusual patterns that may signify a security breach, such as ransomware attacks or unauthorized access. Once a threat is detected, AI-driven systems can initiate automatic responses to isolate it and prevent its spread, reducing the window of vulnerability.

Real-time data processing: By integrating AI into storage systems, your data can be analyzed and processed at the point of storage, reducing latency and accelerating decision-making. This is particularly useful in industries like finance and healthcare, where real-time data analysis can provide a competitive advantage and improve patient outcomes.

Energy efficiency and sustainability: AI can intelligently manage a storage system's power consumption based on workload to reduce unnecessary energy use, allowing you to shrink your carbon footprint and significantly lower your energy costs.

Scalability and flexibility: As your business grows and your data needs inevitably evolve, AI-driven storage systems can dynamically scale up or down to meet these demands without service interruptions. AI systems can automatically adjust storage capacity and performance parameters in real time, ensuring your enterprise has the necessary resources whenever they are needed. By preemptively managing resource allocation, these AI capabilities ensure efficient utilization of storage resources, helping you avoid both over-provisioning and underutilization to optimize cost and performance.

At ProCern, we understand that the efficiency and reliability of your storage solutions can impact everything from your operational agility to your ability to adapt and grow in a competitive market. With Hewlett Packard Enterprise Alletra, powered by AI, we'll help you establish a future-proof infrastructure that not only adapts to rapid technological change but also scales seamlessly to meet your growing data needs.
Future-Proof Your Data Storage

In business, nothing stays the same. Regulations change, new security threats appear overnight, and data continues to expand at an exponential pace. Today there are over 147 zettabytes of data in the entire digital universe, up from 64.2 zettabytes in 2020.* This trend of explosive data growth is expected to become even more pronounced going forward.

Your data supports decision-making, drives innovation, and allows you to outpace your competition; if you're simply keeping up, you're already falling behind. The only way to survive this relentless surge of data and environment of constant change is to implement strategies that protect your data and prepare it for unpredictable yet inevitable shifts. Here are five emerging strategies you can adopt to future-proof your data and maintain a resilient storage architecture:

Automate compliance using AI: Modern AI tools can automatically manage and enforce compliance across your data storage systems, seamlessly adapting to legal changes. By automating these processes, you not only protect the integrity of your data but also build a resilient framework that can quickly adjust to new regulations, giving you peace of mind that you are compliant with the latest standards as they evolve.

Employ blockchain technology to verify your data: Blockchain technology provides a tamper-evident chain of custody for your data, ensuring that each piece of information is part of a secure and immutable ledger. This approach not only protects your data but also makes it traceable, streamlining audits and improving transparency. That level of assurance will only become more critical as regulations around data privacy and transactions grow more complex.

Encrypt your data for stronger security: By using advanced encryption methods, you add another layer of protection to keep unwanted eyes off your data.
Even if attackers breach other defenses, your information remains unreadable, ensuring that sensitive data stays protected against threats that haven't even been conceived yet.

Use object storage: With the ability to handle massive amounts of unstructured data, object storage makes it easy to scale your storage as your business grows. By organizing your data into discrete units, you can access your information anytime, anywhere, without worrying about the limitations of traditional storage methods. Object storage also features built-in redundancy, which means your data remains safe and accessible even in the face of hardware failures.

Implement data tiering and cold storage solutions: By strategically moving less frequently accessed data to cost-effective cold storage, you keep critical data readily available on faster storage while archiving less active data securely. As your data demands grow, this approach not only helps you scale with ease but also better positions you to adapt to changing technology needs without overhauling your entire storage strategy.

Streamline Your Storage Strategy

Adopting proactive, future-ready data management strategies isn't just about protecting your data; it's about empowering your organization to thrive in a data-driven future and stay one step ahead of technological shifts, market demands, and your competition. With HPE Alletra MP, a flexible, high-performance, multi-protocol storage platform, ProCern can help you streamline your storage strategy, ensuring that your data operations are secure, scalable, and ready to meet future demands.

For more information about how to future-proof your data strategy using HPE Alletra MP, contact us here.

* https://explodingtopics.com/blog/big-data-stats