Dark data. Sounds a bit sinister, doesn’t it? Let’s unravel the mystery of dark data and look at the complex issues it raises, along with the ramifications for businesses and organizations of storing (or not storing) that data.

What is Dark Data and Should You Bother to Store It?
Wikipedia defines dark data as data “acquired through various computer network operations but not used in any manner to derive insights or for decision making.”
The volume and rate of data collection can easily exceed the capability of most organizations to properly tag, store and analyze that data. Not surprisingly, given how difficult it can be to separate the wheat from the chaff, it has become common practice to store all the data that is generated.
With over 2.5 quintillion bytes of data created every single day, and an estimated 1.7 MB of data created every second for every person on earth in the coming year, this becomes an increasingly pressing issue for storage architects and IT departments.
Storing all that data creates everything from compliance issues to overburdened storage systems and also increases exposure to ransomware threats, but deleting it has the potential to cause even more problems. What if you accidentally delete something you need, or something that might prove invaluable later on?
So, most organizations continue to add more storage as they accrue data, and much of that data is unstructured data.

What is Unstructured Data?
You could say that enabling organizations to cost-effectively store unstructured data is our business. After all, we’ve been doing that since our first product release in 2006.
Unstructured data is quite abundant in today’s IT landscape. It can be just about anything, from music recordings to medical imaging to video footage. The defining characteristic of unstructured data is that it is not stored in a structured, predefined format. That makes it challenging not just to store, but also to manage.

How do you Store and Manage Unstructured Data?
Over the years, the Caringo team has helped numerous customers store, organize and access massive amounts of unstructured data with our Swarm Object Storage Software. Check out our case studies for details about how we helped organizations like the STFC Scientific Computing Department, Texas Tech University and NEP in the Netherlands.

When you tier data into Swarm object-based data storage, you benefit from continuous built-in data protection, management, organization and search at massive scale. As the pioneer in object storage technology, Caringo builds products with some distinct differences in methodology that give our customers a significant advantage. While we cannot cover them all in just one blog, part of the Swarm difference revolves around our integration of Elasticsearch and how we store and use metadata (a.k.a., data about the data).
To learn more, watch our Tech Tuesday webinar about using metadata with object storage on demand, or read the summary that accompanies it.

How Does Caringo Use Metadata & Elasticsearch to Illuminate Dark Data?
Metadata and Elasticsearch are the key to making data easy to find in Swarm Object Storage. This is a topic we have addressed in a number of blogs and webinars, including our most recent Tech Tuesday webinar (Using Elasticsearch with Object Storage).
Using Swarm’s extensive custom metadata capabilities together with Elasticsearch simplifies the task of locating discrete types of data in a large data store. It gives you dynamic organization of content with classification, keywords, descriptive content and multiple methods to track content, with no separate big data project required.
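To make that a little more concrete, here is a rough sketch of the kind of query body a metadata search boils down to. This is illustrative only: the index and field names below are hypothetical examples, not Swarm’s actual schema.

```python
# Sketch: composing an Elasticsearch query that filters objects by
# custom metadata fields. Index and field names here are made up
# for illustration; they are not Swarm's documented schema.
import json


def build_metadata_query(filters, size=100):
    """Build an Elasticsearch bool/filter query from field -> value pairs."""
    return {
        "size": size,
        "query": {
            "bool": {
                "filter": [
                    {"term": {field: value}} for field, value in filters.items()
                ]
            }
        },
    }


query = build_metadata_query({"x_project_meta": "apollo",
                              "content_type": "video/mp4"})
print(json.dumps(query, indent=2))

# In practice, a body like this would be POSTed to the search endpoint, e.g.:
# requests.post("http://es-host:9200/swarm-index/_search", json=query)
```

The point is that locating “all MP4s tagged with project apollo” becomes a single declarative query over the metadata index, rather than a crawl of the data store itself.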
Once you have data in a Swarm Object Storage cluster, its content and value are illuminated, so you can reap insights and potentially realize new ways to monetize your data.

Get a Custom Demo or Ask our Experts
If you have questions or would like to request a customized demo to explore the use of Swarm Object Storage for your business or organization, contact us. We are ready to help!
As we head into the heart of summer, it reminds me of how quickly time flies and how rapidly technology changes. With a mature product such as Swarm, new features are added regularly to keep up with the evolving needs of the marketplace and our customers. In addition, we consistently review the value of existing features to ensure they not only meet current needs but anticipate future needs as well.
Two short years ago, my colleague Jamshid “Jam” Afshar blogged on how Elasticsearch & Object Storage solves petabyte-scale search as he prepared to discuss the topic in a webinar. Next week, Jam will be joining me on our monthly Tech Tuesday webinar to discuss using Elasticsearch with Object Storage.

What is Elasticsearch and Why Should I Use it?
Elasticsearch is a distributed search and analytics engine that offers a RESTful API which can be used with object storage to enhance metadata searching operations.
In Swarm Object Storage, Elasticsearch provides the ability to list and query objects based on their metadata information. This is a key capability needed to bring structure to a large pool of unstructured data. (If you want to learn more about metadata with object storage, I highly recommend you watch our Using Metadata with Object Storage webinar.)

Why Does Swarm Object Storage Use Elasticsearch?
At Caringo, we were early adopters of Elasticsearch (going as far back as Elasticsearch version 0.90) because we needed a scalable solution to solve the problem of listing objects in a Swarm cluster. At the time, we evaluated NoSQL approaches including Solr, Elasticsearch and MongoDB in addition to traditional SQL database offerings (which lacked the necessary scale then, and still do). We found that Elasticsearch was by far the most promising solution. Specifically, it passed our rigorous testing standards for speed of writes/updates and searches.
Additionally, Elasticsearch included an extensive API for management and diagnostics in an Elasticsearch cluster. The technology has fulfilled the promise we saw in its infancy, growing in popularity and reach, with many large Elasticsearch deployments in production to date.

How Does Elasticsearch Provide Structure to “Big Data”?
Swarm Object Storage software is fully integrated with Elasticsearch. This is implemented in the form of a “search feed” which populates the Elasticsearch cluster with the metadata information present on the stored objects. This information is effectively cached in the Elasticsearch cluster for fast list and query operations.
Furthermore, the Swarm API itself is extended to allow for list and query of Swarm objects in terms of their metadata. This results in the ability for Swarm to index object metadata in near real time, enabling you to perform ad hoc searches on the metadata attributes of stored objects. With the Swarm Content Portal, we take things further by providing a web UI which allows you to easily save frequently used queries as Collections. These Collections can be presented as virtual folders which will always return the latest set of objects that meet the criteria for the query.
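To illustrate the idea behind Collections, here is a minimal sketch of a saved query that is re-run every time it is opened, so it always reflects the current contents of the store. The class and function names are invented for this example; they are not Swarm’s API.

```python
# Sketch: a "Collection" as a saved query that materializes on demand,
# behaving like a virtual folder. All names here are illustrative,
# not part of Swarm's actual interface.

class Collection:
    def __init__(self, name, query):
        self.name = name
        self.query = query  # saved metadata query, e.g. {"department": "video"}

    def materialize(self, search_fn):
        """Re-run the saved query; results are never cached."""
        return search_fn(self.query)


# A stand-in for a metadata search against the storage cluster's index.
catalog = [
    {"name": "clip-001.mov", "department": "video"},
    {"name": "scan-042.dcm", "department": "radiology"},
]

def fake_search(query):
    return [o["name"] for o in catalog
            if all(o.get(k) == v for k, v in query.items())]

videos = Collection("video-assets", {"department": "video"})
print(videos.materialize(fake_search))
```

Because the query runs at access time, adding a new matching object to the store makes it appear in the virtual folder automatically, with no re-filing step.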
Note that although Swarm Software is the ultimate authority for the metadata information of all objects stored, it’s still a best practice to take a snapshot of your Elasticsearch index. This allows for decreased time to recovery in the event of an unanticipated failure and allows you to quickly return list and query capability to clients and applications which depend on it.

Ready to Learn More?
Register today for our July 16 webinar on Elasticsearch where Jam and I will:
- Explain what Elasticsearch is and the benefit of using it with object storage
- Take an in-depth look at best practices for using Elasticsearch with object storage
- Demonstrate the use of Elasticsearch with Caringo Swarm
The post Unleashing the Power of Object Metadata with Elasticsearch appeared first on Caringo.
Independence Day (a.k.a., the Fourth of July) in the USA commemorates the Declaration of Independence (July 4, 1776), in which the Continental Congress declared the American colonies to be united, free and independent states. As we celebrate with fireworks, patriotic music and BBQ, the rest of the world goes on with business as usual. And, in today’s world, business means data—and lots of it!

How Much Data is Being Stored?
What does this mean? You guessed it. The need for cost-effective, highly scalable storage with built-in data protection will only continue to grow.

How is Data Stored?
Data is stored in a wide variety of solutions, from primary block- and file-based storage devices such as:
- SAN (Storage Area Network)
- NAS (Network-Attached Storage)
- DAS (Direct-Attached Storage)
To what is generally considered secondary storage:
- Cloud Storage (all of which is based on Object Storage technology)
- On-Prem Object-based Storage
- Tape Storage
If you want to learn more about the various types of data storage, I recommend you watch our Back to Basics webinar featuring CEO Tony Barbagallo and VP Marketing Adrian Herrera or read the blog What are the Differences Between Block, File and Object-Based Data Storage?

Why Liberate Your Data?
Over the past years, we’ve talked a lot about data being locked in silos. It was one of the first blog topics I tackled when I started working for Caringo in 2015, inspired by Marc Staimer’s Ending Storage Silos whitepaper. The reasons for liberating data from silos and moving it into object storage remain the same:
- Improve your organization’s productivity with data portability between protocols (S3, SCSP, HTTP, HDFS and NFS)
- Expand search capabilities with metadata
- Lower the short- and long-term cost of storing data
- Support data storage and distribution at scale
- Increase resilience of data and simplify recovery
Register now for our August 20 Tech Tuesday webinar to learn about Data Resiliency & Recovery with Swarm Object Storage.
When your organization manages data effectively and can find data when it is needed, you have the freedom to focus on other aspects of business critical to your success. You can better collaborate, create, communicate and take care of your employees and customers.
When you free up staff hours by simplifying data management and dollars by reducing storage cost of acquisition and ownership (TCA and TCO), you create an environment ripe with possibility and innovation.
And, with Swarm Object Storage, you empower your Storage and IT Admin to unplug and find work-life balance.
Consolidating your data on our Swarm Object Storage Platform has never been easier. Check out our How to Migrate to Object Storage from SAN, NAS and Tape Storage resource page to learn just how simple we make it to liberate your data from silos and consolidate it on the Swarm Object Storage Platform for access, distribution and archive. If you need help, contact us. Our experts are happy to help.
How often do you go off the grid and really unplug? If you are a Storage or IT Admin, my guess is not often enough. The responsibility of maintaining a reliable and efficient storage environment for an organization is a heavy mantle. And, it becomes heavier as we increasingly rely on stored data to run a business and to create and/or deliver a product (for example, video content on streaming platforms).

The Power of Unplugging
The power of unplugging from technology for people is well documented—in everything from lifestyle blogs to business articles and scientific research studies. We all know it is difficult in today’s competitive workplace; and let’s face it, some of us are workaholics who thrive by giving our careers 110%. But, we all know that we need to take a step back and go on vacation here and there, enjoy the holidays with our loved ones and indulge in a bit of “me” time.
However, the portability and convenience of mobile phones, tablets and laptops means we hardly ever leave them at home or turn them off. These devices serve as a tether to so many important things in our lives—family, friends, recreation and, last but certainly not least, our jobs. At Caringo, we may not necessarily all be good at unplugging, but we are all committed to making sure that our customers can unplug and not worry about the security of their data!

4 Suggestions for Regaining Work-Life Balance for Storage and IT Admins
When it is literally your job to stay plugged in and you are on call around the clock, how do you unplug? While we cannot alleviate all the workplace concerns that might keep you up at night, we have a few suggestions that might help you rest easy about your data:
- Implement a storage environment with continuous, built-in data protection that is self-healing.
- Make sure your staff is properly trained.
- Have the right Support plan in place.
- Ensure that the health of your storage environment is monitored around the clock.
Object-based storage technology has some inherent benefits that make it valuable for many storage environments. As our VP of Sales Ben Canter mentioned in last week’s blog, Checklist for Evaluating Object Storage, these include:
- Built-in data protection
- S3 compatibility
- Powerful metadata and search capabilities
With Caringo Swarm, we’ve enabled educational institutions such as Texas Tech University, been integrated into scientific research facilities such as the STFC Jasmine super-cluster in the UK, empowered the Media & Entertainment industry as they create and deliver a wide range of video and helped both private business and government organizations store everything from medical records to surveillance video.

Who Can Help Me?
Remember those four suggestions to help you unplug above? Maybe they seem unattainable, but they aren’t. At Caringo, we can help you with each of those four items. Here’s how.

1. Build your storage to have continuous data protection that is self-healing.
A best-of-breed object storage platform like Caringo Swarm will have continuous built-in data protection. With our market-hardened platform, you not only get that continuous data protection (detailed in the whitepaper Protecting Data with Caringo Swarm Object Storage), you get a storage cluster that is self-healing. The Swarm recovery process is automatic (other object storage products require a manual recovery process).
2. Make sure your staff is properly trained.

Hopefully you have a staff, but we know that for many small-to-medium businesses, one person shoulders the load for keeping IT functioning. That means 24x7x365, including vacations and holidays, you might be at least to some extent on call.
If you have a staff, providing them with the proper training is critical. That is why at Caringo we hold 3-day intensive Caringo Certified Training sessions for our customers to ensure they understand how best to use Swarm. Our training is conducted by our own engineering staff, all of whom have been involved in developing, installing and maintaining Swarm Object Storage. This enables our students to dive as deep as they want and to build lasting relationships with the most experienced object storage engineers in the industry. We share best practices gleaned from hundreds of object storage implementations, and we incorporate the feedback given to us in class into our technology and roadmap discussions.

3. Have the right Support plan in place.
One of the complaints that analysts tell us they most often hear about technology products is that the Support is not adequate. At Caringo, we hear just the opposite. We make it our business to offer online self-serve knowledge resources and after-hours emergency access to support for our product. We also offer Professional Services for when you need additional staffing resources or have complex storage changes you want to undertake.

4. Ensure that the health of your system is being monitored.
Whether you have internal or external resources monitoring the health of your storage system, monitoring enables you to be proactive about detecting issues and adding capacity when needed. That is why we added Health Reporting to Swarm several years ago.

Contact Us Today
If you need help getting your storage to the point where you feel like you can unplug and enjoy a vacation, contact us today. One of our object storage experts will be happy to discuss your use case to determine if Caringo Swarm Object Storage is the right choice for you. You can also visit our Getting Started page for more information.
The post How Caringo Swarm Object Storage Can Help You Unplug appeared first on Caringo.
I love shopping for a new car. The new car smell is more addictive than just about anything else out there. But as my wife will attest, the process starts many months ahead of the actual purchase. I research extensively to make sure I am choosing the car that best fits my needs of today and the next few years. One key step in the process is developing the list of must-haves (high safety ratings, space for a 6’ back-seat passenger, heated seats) and nice-to-haves (navigation, heated steering wheel, good gas mileage). With my list in hand, I am able to eliminate options very quickly until I find the few that best fit my needs.
In working with thousands of customers over the years, I have found that when customers take a similar approach to storage solutions as I do with car buying, they consistently leave happy. While we as vendors are experts on our solutions, only the customer can be an expert on their own must-haves and nice-to-haves. For those evaluating object storage, I’ve put together a checklist along with some helpful tips and reference materials.
The most important question to start with is “what is the problem you are trying to solve?” Assuming the answer is that you need storage for your data, we recommend reviewing a few relevant articles to make your decision process smoother.

Overview:
- Build or Buy?
- Storage & Data Management
- Data Management Interfaces
- Service Interfaces
- Unified Namespace
- Metadata Features
There are a lot of choices for storage today. It is likely that you have already determined that object storage is a good fit for your use case because of the many benefits found in best-of-breed object storage solutions such as Swarm:
- Built-in data protection
- S3 compatibility
- Powerful metadata and search capabilities
If you are not certain that your use case is a good fit for object storage, we recommend that you contact us to talk to a storage architect or check out some of our educational resources on the topic:
- Storage Switzerland eBook: NAS vs. Object—Which is Best for Your Data Center?
- Tech Tuesday Webinar: How Does Object Storage Fit Into Your Infrastructure?
- Tech Tuesday Webinar: What Your Storage Vendor Isn’t Telling You About S3
Evaluating Object Storage for Your Use Case
In 2006 when we launched our first product, it was a simple choice as the options were limited. You could use Caringo or you could go to EMC to purchase Centera. Both of these products evolved from the work of Caringo Co-Founder Paul Carpentier, recognized as the inventor of content addressable storage (CAS).
Through the years, more and more companies started to incorporate object-based storage into their offerings with different levels of success. In 2012, we identified 5 “must-haves” for object storage. These included:
- Symmetric Architecture
- Data protection for any size file, any number of files and any capacity
- Instant access, platform neutrality, NO proprietary databases!
- Cloud storage enablement
- Entire stack provided by one company
Fast-forward to 2019, and now there are a lot of object storage vendors to choose from, and that makes the task far more challenging! You not only need to identify the “must-haves,” you must determine if the product will work with your existing infrastructure, meet your immediate needs and then grow with your organization or business.
Last year, I talked with Senior Consultant John Bell about this very topic on the Tech Tuesday Webinar: Evaluating Object Storage Solutions. We discussed a number of points in depth and, based on that discussion, have put together a checklist that can assist you in your evaluation.

Checklist for Evaluating Object-Based Storage Solutions
There are significant differences between object storage platforms. Here are some of the features that you should look at as you narrow the field of products you will take the time to evaluate.
- Does the product have automated rapid recovery?
- How much of the storage hardware is utilized for actual content versus overhead?
- How is the metadata stored?
- What are the performance characteristics of the product, and what are the demands for your use case?
- What level of availability do you need for search, sharing or streaming?
- What is the minimum and maximum capacity?
You should determine if it makes more sense for you to build your system (using software-defined object storage) or buy your system (that is, a turnkey solution where you buy an appliance with the software). There are pros and cons to each. Here are a few things to consider when making this decision:
- Appliance Approach
- “Turnkey” solution
- May not be as flexible as necessary to meet certain requirements
- Units of purchase and associated licensing may also be inflexible
- Software Defined
- Requires more work up front (e.g., hardware sizing and purchase, integration etc.)
- Highly flexible in meeting specific requirements
- Units of purchase and licensing are also very flexible (typically “on demand”)
Look at the storage and data management features and compare the products you are most interested in. Here is a list of features you will most likely want to investigate:
- Combination of UI and API
- “Single Pane of Glass” Web Management Portal
- Monitoring and Event Notification
- Automated Failover and Recovery
- Full Availability
- Capacity On Demand
- Volume Portability (This is a key feature not found in “object on file system” or similar solutions!)
How will you manage the data? I suggest you look for a system that offers you the following:
- “Browse and Query” (Content Portal and API)
- Flexible Protection Schemes
- Combination of Replication and Erasure Coding
- Protection Policy Range (global default to individual object)
- Usage Metering and Quota Support
- Identity Management Integration (Including support for multiple IDM stores)
- Access Control
- Management Delegation
What connections will you need between your Object Storage and your other storage devices or services? Make sure to investigate this thoroughly and outline your requirements clearly.
- Native API (RESTful API based on standard HTTP 1.1)
- S3 (Ideally, a superset of what is found in Amazon S3)
- NAS (NFS & SMB)
- Service Connectors (Public Cloud)
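Since S3 compatibility tops most interface checklists, it is worth seeing how custom metadata actually travels over that protocol: as x-amz-meta-* headers on the object. The sketch below is illustrative; the bucket, key and endpoint names are made up, and the boto3 usage shown in comments is the conventional S3-client pattern rather than anything vendor-specific.

```python
# Sketch: how user metadata rides along with an object over the S3
# protocol, as x-amz-meta-* request headers. Names are hypothetical.

def to_s3_metadata_headers(metadata):
    """Map user metadata to the x-amz-meta-* header form S3 uses."""
    return {f"x-amz-meta-{k.lower()}": str(v) for k, v in metadata.items()}


headers = to_s3_metadata_headers({"Department": "post-production", "Reel": 7})
print(headers)

# With an S3 client such as boto3, the same metadata is passed as the
# Metadata argument and the client builds these headers for you:
# s3 = boto3.client("s3", endpoint_url="http://gateway.example.com")
# s3.put_object(Bucket="footage", Key="clip-001.mov",
#               Body=data, Metadata={"department": "post-production"})
```

A storage system that preserves these headers verbatim lets the same metadata be listed and queried later, regardless of which tool wrote the object.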
While this feature is often overlooked, it is quite important if you want your Object Storage to function efficiently. For a true Unified Namespace, the storage must have:
- Ability to reference the object stored by the same name…
- Independent of how it was created
- Regardless of how it’s being requested (S3, NFS/SMB, Native API etc.)
- Specifically, names that are “human readable”
- Can be done with UUIDs, but this isn’t user friendly
- Allows for alignment of naming conventions across multiple protocols
- Provides automatic synchronization of name changes
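The checklist above can be made concrete with a small thought experiment: under a unified namespace, one human-readable object name should appear unchanged in every access path. The endpoint and URL shapes below are purely illustrative, not documented formats for any particular product.

```python
# Sketch: what a unified namespace implies in practice. The same
# human-readable name resolves over each protocol; only the access
# path around it differs. Endpoint shapes are hypothetical.

OBJECT_NAME = "projects/2019/report.pdf"

def s3_url(endpoint, bucket, name):
    return f"{endpoint}/{bucket}/{name}"

def native_url(endpoint, domain, bucket, name):
    return f"{endpoint}/{bucket}/{name}?domain={domain}"

def nfs_path(mount, bucket, name):
    return f"{mount}/{bucket}/{name}"

paths = [
    s3_url("http://s3.example.com", "media", OBJECT_NAME),
    native_url("http://storage.example.com", "cloud.example.com",
               "media", OBJECT_NAME),
    nfs_path("/mnt/objects", "media", OBJECT_NAME),
]

# Every path carries the identical name, so naming conventions and
# renames stay aligned across S3, native API and NAS access.
assert all(OBJECT_NAME in p for p in paths)
print(paths)
```

Contrast this with a UUID-based store, where each protocol would need its own translation table from friendly names to opaque identifiers.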
Make sure to look at how the storage system manages metadata, as it is key for keeping your data searchable and accessible.
- Should include standard metadata support for system management and basic object query
- Ideally includes comprehensive support for custom metadata
- “Unlimited” custom metadata
- Ability to list and query on custom metadata
- Collections (saved queries) are a powerful tool for dynamic data/object sets (Ability to surface Collections through multiple access protocols is highly desirable.)
- Metadata should be easily managed and protected by the storage itself!
- No separate metadata servers
- No specialized controller nodes
As you examine products, make sure that you will have the right level of support in place, along with the professional services and training you need.
- Commercial Support vs. “Do It Yourself”
- Portals for software access and knowledge base
- Outsourced monitoring and notification
- Professional Services
- Requirements gathering
- Training (on-site, online etc., including Certification)
If you have questions or want to discuss how to get started with object storage, contact us. My team and I are ready to help.
“Data is the new oil.”
—Clive Humby, Mathematician
Businesses lose billions of dollars a year to IT downtime, and the actual loss of digital files and data can be even more disastrous. Today’s business models—across all types of verticals and use cases—are all in some way powered by data. Take the breadth of data that we are collecting. Then, combine that with the multiple formats we store data in. Add to that equation the plethora of storage technologies that have been in use over the past century. Got the picture? Yeah, it’s a bit messy.

How Do You Refine Data?
Much like crude oil, raw data isn’t necessarily useful. So, just how do you refine your data stores? Data, like oil, needs to go to a refinery. The first step is to contain data in a store that gives you the functionality you need. Depending on the type and amount of data, there are many storage technology options to choose from. However, if you have a large amount of data, particularly unstructured data, the best data refinery for you is likely to be some type of S3-compatible object storage solution. I recommend you watch our Tech Tuesday webinar What Your Storage Vendor Isn’t Telling You About S3 to hear what you should ask potential storage vendors.

Why Pool Data in an Object Storage Solution?
One of the fundamental benefits of using an object storage technology is that you can pool massive amounts of data into one repository, thus eliminating the archaic “data silos” that many organizations still struggle with.
Once you have pooled that data, you open up all sorts of possibilities. You can start to extract value from your assets and business intelligence from your conglomeration of data.

How Does Object Storage Work?
If object storage is the refinery, how does it store, protect and manage data? If you want to refresh your memory about how object storage is different from other types of storage (e.g., block and file), watch our Back-to-Basics Webinar or read our Back-to-Basics blog. You may also want to check out the Storage Switzerland eBook: NAS vs. Object—Which is Best for Your Data Center?

Object Storage Protects Assets and Data
In last week’s blog, I mentioned the footage of Elton John that was used in the pre-show to the rock biopic Rocketman. This week, The New York Times reported that decades of Universal Music Group treasures burned in 2008 with casualties including original recordings from stars such as Ella Fitzgerald, Aretha Franklin, Elton and Nirvana. What could have been done to prevent this catastrophic loss?
Data protection is a core function of Swarm object storage. Swarm leverages cluster resources to protect data all the way from bit errors to natural disasters. While many storage vendors tell you that data loss in storage systems is completely avoidable, the truth is that it is right up there with death and taxes.
However, by selecting the appropriate storage system for your assets and data and applying appropriate parameters, you can minimize the probability of data loss. To understand how Caringo Swarm protects data, read these whitepapers:

Metadata enables the refinement of data
Metadata, that is, data about the data, enables endless possibilities to unlock information and identify trends that can transform your business. In Caringo Swarm, metadata is stored directly with the object. Learn more about metadata by watching our Tech Tuesday webinar or reading the summary.

Elasticsearch lets you search metadata
Elasticsearch is a distributed, RESTful search and analytics engine that, when used with object storage, enhances metadata searching operations. Each Search Feed indexes metadata in Elasticsearch. In Swarm, Search capabilities map one to one with S3 metadata. This brings a number of benefits such as:
- The ability to derive actionable insight from targeted analysis
- Dynamic organization of content using classification, key words and descriptive content, with multiple ways to track that content
- Integrated search stack optimized within the storage system
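The “actionable insight” point above can be made concrete with an aggregation: rather than listing objects, a single query can summarize how an entire data set is classified. The sketch below builds an Elasticsearch terms-aggregation body; the field name is a hypothetical example, not a documented Swarm field.

```python
# Sketch: summarizing object metadata with an Elasticsearch terms
# aggregation. The metadata field name is illustrative only.
import json


def build_classification_agg(field, top_n=10):
    """Count objects per value of a metadata field, without fetching them."""
    return {
        "size": 0,  # no document hits needed, only aggregation buckets
        "aggs": {
            "by_class": {"terms": {"field": field, "size": top_n}}
        },
    }


body = build_classification_agg("x_content_class_meta")
print(json.dumps(body, indent=2))

# POSTed to the index's _search endpoint, the response's "by_class"
# buckets report how many stored objects carry each classification.
```

That one request replaces what would otherwise be a full scan of the repository, which is exactly the kind of targeted analysis an integrated search stack enables.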
When it comes to managing all of that data, you need the right tools for the job. Over the years, we’ve worked to make that task simpler and more efficient. To learn more, check out these three Tech Tuesday webinars:
- Using the Swarm Object Storage Content Portal UI
- Using the Swarm Object Storage Content Portal
- Monitoring Swarm Object Storage Using Prometheus Exporter & Grafana
With different types of businesses and organizations, the requirements for storage and the appropriate strategy vary. Carefully architecting your solution and doing a proof of concept to ensure compatibility with your existing systems is an important step as you investigate new technologies and solutions. If you have questions, I’d like to offer you the option to do what I do: talk to our experienced Object Storage experts. Just contact us and we will be happy to answer your questions or set up a customized demo for you.
Here in Austin, TX, the kids are out of school, temperatures are starting to soar and it is time to escape the heat by heading for the movie theater. The Rotten Tomatoes Rocketman Critics Consensus describes the movie by saying, “It’s going to be a long, long time before a rock biopic manages to capture the highs and lows of an artist’s life like Rocketman.”

Pre-Show Entertainment at The Alamo Drafthouse
At the Alamo Drafthouse (a little movie chain that started up in Austin over 20 years ago), the movie pre-show consists of many interesting clips from Elton John performances, and even William Shatner interpreting the lyrics to Rocketman at a 1978 SciFi Awards Show after being introduced by none other than Bernie Taupin, Elton’s longtime lyricist. (The link shows a clip on YouTube for those of you who have five extra minutes to spare and are feeling particularly brave today.)

No Spoiler Alerts Here
Once the movie begins to roll, you see a highly stylized and intense film that takes you on the rollercoaster ride of Elton John’s life, providing tremendous insight into his artistry and the contributions that he and Bernie Taupin have made to music, culture and community over the past fifty years. If you love the music of Elton John, run—don’t walk—to see this on the big screen.

What Does Rocketman Have to Do with Object Storage?
I’m glad you asked. First, all that original footage used in the movie pre-show had to be recovered from some type of storage solution (likely, much of it was sitting on tape or in the cloud). Secondly, and I promised no spoiler alerts, the special effects in the movie are spectacular. Not in an action-packed, Marvel Universe or DC Comics style, but in an artistic and passionate way that makes this film gut-wrenching one moment and inspiring the next.
That, of course, leads me to ask the question, what technologies enable the sharing of historical events and the making of new films? How does Object-Based Storage like Caringo Swarm fit into the picture? (Pun intended.)

How Can Object Storage Help?
At Caringo, we see the need for storing digital video growing daily, and we hear from post-production houses and visual effects editors that they need a cost-effective, reliable platform for archiving footage that is searchable and provides instant access to video clips. They want to be able to seamlessly tie this into asset managers such as CatDV, Marquis Project Parking, Cantemo and Vidispine. And, just as importantly, they want to safely store these assets indefinitely.
Understanding Object Storage Technology
Whether retrieving historical footage or creating new films, enabling efficient file movement is critical, as is being able to find and retrieve clips. Learn more about how Caringo Swarm plugs into asset management solutions with the S3 API by reading this blog or by watching our recent webinar: How to Enable Video On-Demand in Workflows.
Check out our many educational object storage resources designed to help you understand Object Storage technology and the many benefits it can bring to your business or organization. Having pioneered Object Storage technology, we have a staff of highly experienced Object Storage Engineers who are happy to talk to you about your specific needs and help you architect the solution that will work for your environment. As a bonus, with our continuous built-in data protection and easy-to-manage data storage platform, Swarm Object Storage will leave you with the peace of mind and time to go soak in some of that summer fun!
At Caringo, we pride ourselves on making something complex—i.e., storing and accessing TB–PBs of data and billions of objects on heterogeneous hardware—easy to manage.
The key to this is visibility into system status. We have had the option to use SNMP and Nagios for many years; however, we kept getting requests for an intuitive way to monitor Swarm object storage using more current monitoring and visualization platforms.
Integrate Elasticsearch into Swarm Object Storage
We took the first step in this direction in 2016 when we first integrated Elasticsearch into Swarm. Elasticsearch is an open-source RESTful search engine built upon Lucene. This provided us with a scalable way to index, view and search system and custom metadata.
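To make the idea concrete, here is a minimal sketch of the kind of metadata query that indexing in Elasticsearch makes possible. The index name and field names below are illustrative assumptions, not Swarm’s actual schema:

```python
import json

# Hypothetical example: find video objects tagged with a given project.
# The field names ("content_type", "x_custom_meta_project") are
# illustrative assumptions, not Swarm's actual metadata schema.
query = {
    "query": {
        "bool": {
            "must": [
                {"term": {"content_type": "video/mp4"}},
                {"match": {"x_custom_meta_project": "rocketman-preshow"}},
            ]
        }
    },
    "size": 25,
}

# This body would be POSTed to Elasticsearch's _search endpoint.
print(json.dumps(query, indent=2))
```

A search like this is what lets an editor find every clip tagged to a project without walking the entire archive.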
Launch Prometheus Node Exporter for Swarm
The next step toward visualizing system status came in April 2019 when we launched the Prometheus node exporter for Swarm 10. Prometheus is a popular open-source monitoring solution, and the node exporter exposes Swarm-specific metrics for Prometheus to collect.
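In practice, Prometheus simply scrapes the exporter’s HTTP endpoint on a schedule. A minimal scrape configuration might look like the following; the job name, node addresses and port are placeholders rather than Swarm defaults:

```yaml
# Minimal prometheus.yml sketch; the job name, node addresses and
# exporter port below are illustrative placeholders.
scrape_configs:
  - job_name: swarm
    scrape_interval: 30s
    static_configs:
      - targets:
          - swarm-node-1:9100
          - swarm-node-2:9100
```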
Using Grafana Open-Source Metrics Visualization Platform
The third and final step was using Grafana—an open-source metrics visualization platform. We were able to leverage some of the existing Grafana templates to quickly visualize Swarm metrics over a customizable period of time. Our technical staff has been using this internally for a few months to optimize Swarm cluster configuration with excellent results.
Demo: Monitoring Swarm Object Storage with Prometheus & Grafana
In our June 11 TechTuesday webinar (at 7am PT/10am ET), Monitoring Swarm Object Storage Using Prometheus Exporter & Grafana, John Bell, Senior Consultant, will host Abraham “Avi” Felsenstein, System Integrator. Avi will demonstrate how to import data from the Swarm Prometheus node exporter and view it via Grafana.
The post Visualizing Swarm Object Storage Status with Prometheus & Grafana appeared first on Caringo.
In a recent blog, I detailed the market conditions that are driving customer requirements. The blog was about sports video; however, we are seeing the same trends in every market that relies on digital video (post production, film, surveillance, houses of worship and corporate training to name a few).
The biggest challenge for many professionals today is enabling “on-demand” in existing workflows. Said another way, they are struggling with providing instant access to digital video and delivering it immediately to any device. At the heart of enabling “on-demand” is efficient file movement. This blog details a few of the specific features and interfaces Swarm employs to enable efficient file movement. The first step in efficient file movement is being able to integrate with Swarm via your existing applications and workflows.
- Sustained Data Streaming
- S3 Support
- S3 Support Importance
- Parallel Uploads
- Range Reads
- Partial File Restore
- Wrap Up
Unlike many gateway and file system interfaces on the market, SwarmNFS is a file-to-object converter. It provides a mountable volume for your NFS or SMB applications and converts files to objects in a lightweight fashion (in flight) without spooling or caching. Therefore, you can use NFS, SMB or even S3 to read, write, modify or access a file, enabling true multi-protocol access. In a recent benchmark, a single instance of SwarmNFS delivered 1.56 GB/s read performance on commodity hardware.
Continuous S3 Support
The Amazon S3 API has become the de facto object storage interface. I stress “de facto” because it is technically not a standard. That said, we spend a lot of time making sure we stay as true to the specification as possible.
Why is S3 Support Important?
S3 support is important because, beyond standard storage protocols like NFS and SMB, the S3 interface is the way most ISVs, data movers and asset managers integrate with the cloud and with on-premises object storage. Caringo’s object-based software-defined storage solution Swarm supports the Amazon S3 API through an extensible architecture, which can later be used to seamlessly support additional third-party APIs. A broad range of applications that currently support the Amazon S3 API work directly with Swarm. If you are interested in learning more about how Swarm plugs into asset management solutions via the S3 API, attend our webinar How to Enable Video On-Demand in Workflows on May 30 (or watch it afterwards on demand). Caringo will be demoing all of the asset manager integrations we highlighted at this year’s NAB 2019 Show, including CatDV, Marquis Project Parking, Cantemo and Vidispine. You may also be interested in Caringo’s May 28 webinar What Your Storage Vendor Isn’t Telling You About S3, where Caringo industry experts will discuss the important details your storage vendor isn’t telling you.
Parallel Uploads
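As a rough sketch of what a multi-part (“parallel”) upload involves on the client side, an object is split into parts that can be sent concurrently and then combined on the target. The part size, object size and upload stub below are illustrative; the stub stands in for a real S3 UploadPart call:

```python
from concurrent.futures import ThreadPoolExecutor

PART_SIZE = 5 * 1024 * 1024  # S3's minimum multipart part size; last part may be smaller

def split_parts(total_size, part_size=PART_SIZE):
    """Return (part_number, offset, length) tuples for a multipart upload."""
    parts = []
    offset, number = 0, 1
    while offset < total_size:
        length = min(part_size, total_size - offset)
        parts.append((number, offset, length))
        offset += length
        number += 1
    return parts

def upload_part(part):
    # Stand-in for a real S3 UploadPart request against the cluster;
    # here we just report what would be sent.
    number, offset, length = part
    return (number, f"etag-{number}")

parts = split_parts(12 * 1024 * 1024 + 512)  # an object just over 12 MB
with ThreadPoolExecutor(max_workers=4) as pool:
    etags = list(pool.map(upload_part, parts))

print(len(parts))  # 3 parts: 5 MB + 5 MB + the remainder
```

Because every Swarm node can accept writes, parts of one object can land on different nodes at once, which is what makes this pattern efficient on a parallel architecture.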
From an architecture perspective, Swarm employs a parallel approach—that is, all nodes can perform all operations. This makes multi-part or “parallel” uploads an efficient way to ingest files, and it also streamlines combining the multiple parts of a file once on Swarm.
Range Reads
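For a flavor of how an application can use range reads, the sketch below translates a playback start time into an HTTP Range header. The constant-bitrate assumption is a simplification for illustration; real players typically consult container indexes to find the exact offset:

```python
def range_header_for_seek(seconds, bitrate_bps):
    """Build an HTTP Range header to start playback at a given time,
    assuming a constant-bitrate file (an illustrative simplification)."""
    start_byte = (seconds * bitrate_bps) // 8  # bits -> bytes
    return {"Range": f"bytes={start_byte}-"}

# Seek 90 seconds into an 8 Mbit/s stream:
hdr = range_header_for_seek(90, 8_000_000)
print(hdr)  # {'Range': 'bytes=90000000-'}
```

Sending this header with an ordinary HTTP GET asks the storage to return only the bytes from that offset onward, so nothing before the seek point is transferred.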
The file movement benefits of Swarm aren’t limited to ingest; Swarm also provides efficient ways to access data, as the native interface to the software is based on HTTP. Swarm object-based storage software enables range reads, offering an application like a video player the ability to specify the exact location in a file at which to start a playback operation. This eliminates the need to download or cache the undesired portion of the file.
Partial File Restore
Partial File Restore in Swarm (currently in beta, with general release scheduled for Fall 2019) is a well-known feature in the M&E world that Caringo is bringing to object storage. Partial File Restore for object storage lets you specify (via a web-based UI or the API) the portion of a video file you want and then create a clip of only that portion. That clip can then be moved to a specific application, downloaded by authorized users, or streamed directly from Swarm to authorized users, employees, subscribers or viewers. Get a personalized preview of Partial File Restore before Caringo’s launch later this year.
Wrap Things Up
This is just a short list of interfaces and features that enable digital video professionals to leverage the benefits of Swarm software-defined storage for efficient file movement into, within and out of Swarm while plugging into existing workflows. In addition to the resources I listed above, we have a growing library of on-demand webinars and highly informative blogs. With the Swarm 11 release just around the corner, our field-hardened object storage contains far more features than the few highlighted here. If you are interested in a full overview, don’t hesitate to reach out to Caringo with questions about your specific use case or to schedule a private demo!
In 2019, Caringo has continued to lead the way in the object storage industry by providing insight into the inner workings of our technology and industry trends, along with experience gleaned from hundreds of successful object storage implementations. We know you have a lot of choices to make about how you store data, and that your organization most likely needs more than one type (or tier) of data storage technology. Following is a collection of resources to help you make informed choices when the time comes to integrate a cost-effective tier of storage for access, distribution and archive.
On-Prem or Cloud, S3 Storage Rules in 2019
Whether you are looking for an on-premises solution or an effective way to tier data to a public cloud like AWS, Microsoft Azure or Google, you’ve most likely come to the conclusion that you need all of your storage to be S3 compatible.
If you want to learn more about S3 API support in object-based data storage, register for our May 28 Tech Tuesday webinar: What Your Storage Vendor Isn’t Telling You About S3. You will have the opportunity to ask questions of John Bell, Senior Consultant, and Eric Dey, Director of Product, during the webinar. (Alternatively, you can watch the webinar recording on demand after the live event.)
Managing and Moving Data Between Storage Platforms
Managing your data and moving it between various storage platforms can at times be problematic. This is particularly the case when you use storage products that are designed to keep you continuously purchasing expensive hardware to stay ahead of your data growth.
To help our customers combat this and enable them to dramatically cut their storage TCO, we’ve evolved our tools to enable you to manage data and move it from SAN, NAS and tape into object or cloud storage. Check out the Tech Tuesday Using FileFly to Manage Your Data with Azure, Google, Amazon or Swarm webinar on demand to learn more.
The Magic Behind Metadata
Metadata (the “data about data”) has always been a passion for us at Caringo, and we’ve made unique strategic choices about how we manage metadata to ensure our users can unlock the intelligence potential that resides in large data repositories. Ryan Meek, Principal Solution Architect, is our metadata expert, and he talks to John Bell about it in the Tech Tuesday webinar Using Metadata with Object Storage.
Storage for Video Workflows
An arena where we have recently seen tremendous growth of data is in Media & Entertainment. Whether music, gaming or movies, our society today relies on technology not just for work but for play. Our engineering team is constantly working with our technology partners and customers to enable workflow solutions, particularly for video. Ryan Meek and Sales Engineer Jose Juan Gonzalez Marcos discussed this topic and gave a demo of how our object storage works with Media Asset Management (MAM) products in the How Storage Streamlines Workflows in the VOD/OTT Era webinar.
Jose will be back on May 30 with VP of Marketing Adrian “AJ” Herrera for a webinar where he will provide a demonstration of how to enable on-demand workflows. Register now to watch live or on demand.
What Do You Want to Learn About?
What topics would you like to see us cover in 2019 and beyond in our webinars and blogs? Email us at email@example.com with your requests and/or questions. We are always happy to provide information you need to choose the right storage solution for your business or organization.
The challenges of enabling “on-demand” workflows are being felt across every industry driven by digital video. However, those who have large content archives or are struggling with supporting live events are facing particularly challenging issues. Sports video professionals need to deal with both. In this blog, I will give a high-level overview of how object storage enables “on demand” for sports video workflows. First, let’s level set on the definition of “on demand” and the resulting requirements.
What Does “On Demand” Mean and What Does It Require?
On demand means delivering content at the end user’s convenience. Depending on where you sit in the sports video lifecycle, your end user is different. If you are on the production side, your end user may be VFX or colorists, or possibly your client requesting a new project that reuses clips from previous games or episodes. If you are in broadcast, maybe the end user is a regional station or a subscriber. Or, if you are a sports team, maybe your end users are producers, executives, coaching staff, trainers or athletes. What delivering content at their convenience boils down to is that (1) you can find the file and (2) you can stream or deliver it to the required application or device when they request it.
How Does Object Storage Enable On Demand for Sports Video?
When evaluating data storage solutions, it comes down to your budget and requirements. It’s reminiscent of what Billy Beane did with the Oakland Athletics in 2002 by using sabermetrics. The A’s had a $44M budget, the third lowest in Major League Baseball (MLB) at the time, with the Yankees’ $125M budget being the highest. What Mr. Beane and his staff realized was that the traditional, subjective form of recruiting often fell short, and that if he focused on what led to scoring (on-base and slugging percentages), he could pick up undervalued players who statistically had a chance against teams at the top of the budget scale. This led to the team’s famous 20-game winning streak.
If your focus is enabling on-demand access or providing economical, anytime access to content, object storage is your best storage option. Object-based storage maximizes the efficiency of your budget by leveraging commodity hardware (similar to how Mr. Beane maximized the efficiency of his budget via undervalued players), delivering cost-effective, scalable storage that includes self-healing, rapid recovery, automated management, built-in replication (and other features) with instant content access. Best-of-breed solutions from object-based data storage vendors like Caringo have search, parallel uploads and direct streaming as standard features. This is why object storage is the enabling technology behind every cloud storage service and why there is an object store at the heart of every major video-on-demand service today (Netflix, Hulu, Amazon Prime and others).
Object Storage in Sports Video Workflows
So exactly where does object storage fit into Sports Video workflows? Below are a few diagrams that show where object storage would fit into workflows for private streaming and longtail video on demand (VOD), tape storage replacement and centralized backup. The sections in purple with the Caringo Swarm logo indicate where the object storage solution is deployed.
As with any data storage technology, being able to use object-based software-defined storage boils down to the protocol or interface. Historically, object storage was accessed through a proprietary RESTful interface. To interface with object-based storage, an application developer had to integrate to the vendor’s API. In layman’s terms, this meant object storage didn’t just work out of the box with applications like file-system-based solutions that relied on SMB/CIFS (Windows) or NFS (Linux). This led all object-based storage vendors to create interfaces for SMB/CIFS and NFS.
However, the tipping point for object storage was the proliferation and support of the Amazon S3 API. Now, just about every current application used by sports video professionals either supports the Amazon S3 API already or will in the next year or so. Support of the Amazon S3 protocol means you can use either the Amazon S3 cloud service or any on-premises object storage solution that supports S3.
Where Can I Learn More About the S3 API in Object Storage?
If you are interested in learning more about S3 API support in object-based data storage, register for our May 28 Tech Tuesday webinar: What Your Storage Vendor Isn’t Telling You About S3. (If you are reading this after May 2019, watch the webinar recording on demand.)
How Do I Know If My Organization is Ready for Object Storage?
- It is taking too long to access video and project files from your archive
- You need to stream or share internal video but don’t want to use a CDN and the files are too large to email or FTP
- You need to support workflows that need access to archived content via S3, NFS and SMB
- Cloud storage and NAS are too expensive and tape storage recall times are too long and difficult to manage
Research aside, if you identify with any of the above statements then your organization is probably ready for object storage.
How Can Caringo Data Storage Help Sports Video Professionals?
With all that said, object storage isn’t a panacea, but it is an increasingly important storage technology that enables on-demand access. We have some great resources that can help you understand the differences between data storage tiers, block storage vs file storage vs object storage, and how to migrate from tape storage to object storage.
As we’ve been introduced to sports teams, broadcasters, entertainment venues and universities, we started hearing a lot of common themes in their challenges. Here at Caringo, we started our search for an organization where we can learn more and where our expertise can be leveraged to solve these challenges. This led us to the Sports Video Group (SVG), and we are proud to be one of their newest members. SVG is a group that was created to advance the creation, production and distribution of sports content. If you will be at the upcoming Sports Content Management Forum in NYC on July 24 and would like to meet, let us know! Or, you can schedule a consultation with us at any time.
The post Object Storage: Enabling On-Demand Workflows for Sports Video appeared first on Caringo.
Everywhere you look these days, there are articles about new scientific breakthroughs. The storage world is abuzz about the first real picture of a supermassive black hole. This black hole is at the center of galaxy M87, and it took about 3.5 PB of data to generate the picture. In total, they collected 5 PB; that is 5,000 TB, or about 625 8TB drives. Amazingly enough, hard drives with the data were shipped via airplane from different locations to be consolidated!
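Some back-of-the-envelope arithmetic shows why the drives flew. Assuming a dedicated 10 Gbps link running at 80% utilization (both figures are illustrative), moving 5 PB over the network takes nearly two months:

```python
def transfer_days(bytes_total, link_bps, efficiency=0.8):
    """Days needed to move data over a network link at a given utilization."""
    seconds = (bytes_total * 8) / (link_bps * efficiency)
    return seconds / 86400

five_pb = 5 * 10**15  # 5 PB in decimal units
print(round(transfer_days(five_pb, 10 * 10**9)))  # ~58 days on a 10 Gbps link
```

At that scale, a pallet of hard drives on a plane really is the higher-bandwidth option.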
Why in the world would you ship data around on drives instead of using the cloud or FTP? The problem was not storage capacity (5 PB is easy enough to store in AWS). The issue was transferring that amount of data in a reasonable time frame.
A World Full of Data
Scientific data is often collected from an eclectic mix of sources, and can easily fall victim to the age-old curse of storage silos.
Consider just a few of the various sources for data:
- Historical records on various types of storage (from handwritten notes to archival tape to various storage platforms)
- IoT devices, telemetry units, telescopes, etc.
- Surveys & Interviews
- Observation (by researchers or by video)
As science progresses, research organizations around the world strive to arm their researchers with the technology to continue making advancements, and data storage is an important tool in the world of high-performance computing (HPC). However, similar to the point made in AJ Herrera’s recent blog What are the 5 Tiers of Storage for New Video Production Workflows, one tier of storage does not fit all.
In a research setting, a well-designed storage infrastructure integrates various tiers (or types) of storage to enable the collection, storage and analysis of scientific data. However, recent advances in globally distributed workflows and the resulting access requirements are driving a paradigm shift from distributed and parallel file systems to object storage.
Can Object-Based Data Storage Replace Parallel File Systems?
“Yes! For read-intensive workloads,” concluded CEO Tony Barbagallo when he posed the question “Can Object Storage Really Replace Parallel File Systems?” in our blog. To say this another way, object storage (on the appropriate underlying infrastructure) can enable high-throughput, managed access to research data, streamlining distributed access and reducing time to discovery. An example of this is how the UK’s Science and Technology Facilities Council (STFC) Rutherford Appleton Laboratory (RAL) Space uses Caringo Swarm Object Storage as part of their JASMIN super data cluster. Prior to selecting Caringo Swarm, STFC performed extensive benchmark testing on a number of object storage solutions to determine which best met the requirements for the project.
It is going to take more than just massive amounts of data storage for the scientific community to streamline distributed collaboration. It will take a coordinated approach between storage, networking and data analysis tools, such as those provided by our partner Globus. Globus is a secure, reliable research data management service used by thousands of organizations to move, share and discover data via a single web browser interface.
Learn more about Globus and how it works with Caringo Object Storage to solve issues by combining the benefits of S3-enabled private cloud storage with secure, reliable research data management services by reading our solution brief.
The post Scientific Breakthroughs and the Role of Data Storage appeared first on Caringo.
As your digital video workflows grow in scope, your underlying storage strategy must adapt. The days of buying one tier of storage for editing and one tier for archive are quickly coming to an end due to rapidly evolving asset reuse, globally distributed workflows and on-demand delivery requirements.
- On-Going Challenges
- Storage Requirements
- 5 Tiers of Storage
- Mapping to Requirements
- Mapping to Workflows
So, how do you define your storage strategy to create better, more cost-effective workflows? What are the characteristics of each tier of storage needed? And, what are the variables to consider when defining how much of a specific storage tier you need?
What are Some Challenges of Video Content?
1080p workflows are quickly turning into 4K workflows, with 8K workflows starting to take hold and 16K cameras looming on the horizon. Therefore, both the size of the image (the resolution) and the size of the resulting files are growing. In addition, uncompressed workflows are now being requested, resulting in multi-TB files.
Aspects of Creation & Consumption
The rise of video on-demand (VOD) services, broadband and mobile devices has changed the way content is produced and consumed in several ways, including:
- Consumers now view content when they have time, on the device that’s most convenient to them
- Resolutions range from mobile (720, 1080, 2K) to 4K with 8K TVs hitting the market
- You now have to create various versions, dimensions, resolutions for the exact same piece of content
With video being repurposed and reused, nothing is thrown away, and the need to protect content and keep it accessible is critical. Here are some of the factors that intensify the need for protection and access:
- Video content has intrinsic value well beyond the initial use
- All project files, source footage and produced content need to remain instantly accessible
- On-demand workflows are straining tape-based workflows; it takes too long to recall assets
- Speed: What is the speed required to perform your particular task? If you are editing 8K footage in an uncompressed format, you are going to need really fast storage. If you need playout functionality playing back compressed formats over the web, you don’t need as much speed. The goal is to match the speed to the task.
- Length of File Use: How long are files going to be used for their immediate purpose? How long do they have to stay on a particular tier of storage?
- Repurposing: When is a specific file going to be reused? Is it within 6 months?
- Archiving: If you know if and when a file will be repurposed, you can develop your archive strategy. For instance, if you are not going to use a file within 6 months, perhaps it’s time to move that file to a secondary or tertiary tier of storage.
Once we look at our workflows and file use characteristics through the lens of these requirements, we can start to determine what capacities we need for each specific tier of storage to complete our task. Now let’s take a look at the 5 tiers of storage available for new video workflows.
What are the 5 Tiers of Video Data Storage?
Ultra-fast NVMe is similar to RAM. This is the type of storage needed for 8K or 16K uncompressed workflows since it is faster than SSDs. You can get content off the drive quickly due to low latency, and do it with fewer drives than used in a traditional RAID array. However, networking requirements are massive, along with the resulting compute and overall price tag. This isn’t a cheap solution, which is why you buy only the amount needed for your specific task.
Ultra-Fast SAN/NAS Architecture
This is your more traditional SAN and NAS environment needed for high-speed editing platforms. They are fast and reliable and optimized for video workflows, with large capacity drives and arrays with a multitude of connectivity options (10 Gig, 40 Gig, Fibre…). However, networking upgrades may be required. There are high costs of ownership due to power/maintenance, and performance degrades as they near capacity. These factors are behind the need to eventually offload files to different tiers of storage.
NAS & Filer
If your content has a shelf life of 3–6 months, then you may want to consider putting it on a NAS or filer. They are relatively low cost (vs NVMe and Ultra-Fast NAS/SAN) with Petabyte-level capacity. However, they have limited scalability beyond multiple Petabytes, are often not accessible over the web, and have long rebuild times in the case of a sector or drive failure. In addition, support and maintenance fees in subsequent years can often exceed the original purchase price.
Local Archival
For on-prem and local archive, you have two options: LTO (tape) or object storage (HDD-based). The benefits of LTO are low cost per TB with virtually unlimited tape capacities. However, you are locked into a tape format, maintenance is often an issue, and if files are stored across tape drives it can take a long time to recall assets. The benefits of object storage are also low cost per TB, with a cloud-like, low-maintenance experience and dense capacities. However, object-based storage requires available data center footprint and (like LTO) an HSM or data mover to plug into certain file-system-based workflows. For most organizations dealing with new video production workflows and on-demand requirements, object storage is the tier of storage with the most rapid rate of growth.
Cloud Archival
The cloud also employs both LTO and object storage technologies to enable archive services. The benefits include zero maintenance with unlimited capacity and instant deployment. Those benefits, however, come at a cost that compounds over time and ends up more expensive than on-prem solutions if you are keeping assets for more than 2–3 years.
Which Storage Tier is Best for My Requirements?
Depending on your requirements, you can match the appropriate type of storage. This is illustrated in the chart above, where the circles represent the amount of time that files are going to sit on each tier of storage, based upon the requirement. For instance, if speed is the primary requirement, then files will sit on NVMe or Ultra-Fast NAS/SAN. However, if archive is the primary requirement, the file will most likely be on your local archive or cloud archive tier. This table is another tool you can use to help you map your current workflows to the associated tier of storage.
Which Storage Workflow is Best for Video Production?
Similar to the previous matrix, this table shows what storage tier is ideal for a specific workflow. For instance, for production you would most often use NVMe or Fast NAS/SAN. For video-on-demand (VOD) or over-the-top (OTT) enablement, you would most often use a local archive or the cloud. You can also use NAS for VOD and OTT; however, you would need to add additional layers of infrastructure like a web server. For archiving projects or masters, a local archive or the cloud are the clear choices.
Learn How Storage Can Streamline VOD & OTT Workflows
Watch this on-demand webinar to learn how storage streamlines workflows in the VOD/OTT era. You will see how you can stream content directly from storage, how to set up multi-site content distribution for collaboration, and how to unify file systems like NFS and RESTful S3 workloads. In addition, you will gain an understanding of the pros and cons of using object storage vs tape storage.
Conclusion
We hope that the information covered in this blog and the different tools presented help you define your storage strategy. As always, we and our partners at JB&A Distribution are here to help. If you have questions or would like to schedule a custom demo, contact us today.
The number of tiers of storage you deploy and their resulting capacity is all going to come down to your budget and the requirements determined by a needs analysis. If you are struggling with editing 8K or 16K uncompressed files, then you should take a look at NVMe. On the other hand, if you are struggling with VOD/OTT enablement or keeping archives instantly accessible, you should take a look at object storage or the cloud.
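To rough out the cloud-vs-on-prem cost tradeoff mentioned in the tiers above, a simple cumulative-cost calculation helps. The prices below are placeholders purely for illustration, not quotes from any vendor:

```python
def cumulative_cloud_cost(tb, price_per_tb_month, years):
    """Total cloud storage spend over a period at a flat monthly rate."""
    return tb * price_per_tb_month * 12 * years

# Illustrative figures only: $20/TB/month recurring cloud cost for a
# 500 TB archive, compared against a hypothetical one-time on-prem outlay.
archive_tb = 500
for years in (1, 3, 5):
    print(years, cumulative_cloud_cost(archive_tb, 20, years))
```

Recurring per-TB pricing grows linearly with both capacity and retention time, which is why long-lived archives tend to favor on-prem tiers.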
The post What are the 5 Tiers of Storage for New Video Production Workflows? appeared first on Caringo.
Quick. Fast. Performant. We often hear requirements for high performance when talking about storage. When we drill deeper, that translates to “fast, we need it to be fast.” This response is common and troublesome.
How Fast is Object Storage?
Fast is relative. Compared to a semi-truck, a Ferrari is fast. But each serves a different purpose. When it comes to storage, measuring “fast” and what that means in the real world can be a complicated endeavor. Two metrics are typically measured: input/output operations per second (IOPS) and throughput.
IOPS vs Throughput
IOPS measure the speed of operations, which is a useful measurement of performance for file-system-based solutions since files are essentially shredded into thousands of pieces and need to be stitched together quickly on read. Throughput, however, is more about the total amount of data that can be read from a storage system.
IOPS is a great measurement to determine if you can support a specific application—like a 4K or 8K editing suite (as one example). Alternatively, throughput is often used in the context of how quickly you can deliver content to different applications, clients and users.
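The relationship between the two metrics is simple arithmetic: throughput is IOPS multiplied by I/O size, which is why an IOPS figure on its own says little. A quick worked example:

```python
def throughput_mb_s(iops, io_size_bytes):
    """Throughput (in MB/s) implied by an IOPS figure at a given I/O size."""
    return iops * io_size_bytes / 1_000_000

# The same 10,000 IOPS means very different throughput depending on I/O size:
print(throughput_mb_s(10_000, 4_096))      # 40.96 MB/s at 4 KB I/Os
print(throughput_mb_s(10_000, 1_048_576))  # 10485.76 MB/s at 1 MB I/Os
```

Large-object workloads like video delivery live at the right side of that range, which is why throughput, not IOPS, tells you how an object store will behave.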
For object storage, throughput is the main performance characteristic to measure, with IOPS being a secondary measurement.
Throughput of Object Storage
When measuring the performance of an object storage system, not only must raw throughput be taken into account for both data ingest and retrieval, but data protection and recovery speeds need to be accounted for as well. What use is a massive ingest rate if the data is not protected? Being able to claim that you can lose data faster than the competition is not smart marketing. And, not being able to keep up with a client workload is problematic.
Why Conduct Performance Testing of Object Storage?
While being able to quote large throughput numbers is impressive, performance testing storage systems helps us out in many other ways as well. It helps us fine-tune the software and underlying mechanisms. It shows us how the software scales as environments grow in capacity and complexity. Most of all, performance testing lets us know if the storage solution will meet the needs of the customer workload, as it did in our recent object storage performance benchmarking for the UK Science and Technology Facilities Council’s (STFC) JASMIN super data cluster. Download the STFC Object Storage Benchmarking Case Study & Whitepaper.
Lies, Damned Lies, and Statistics
It has been said that “there are three kinds of lies: lies, damned lies, and statistics.” One thing to watch for when reviewing performance statistics is how the testing was performed. For example, if a system posts impressive numbers and performs fantastically with 100-byte files but performance falls off dramatically for files over 1 KB, then the usefulness of that system would be very limited. Was the testing performed in simulation only, or with real-world tools and data? What does the test data actually measure: writing to cache, or the final write to the target media (usually HDD for object storage)? Useless metrics are just that: useless.

Learn More
To learn more, please tune in to our next Tech Tuesday Webinar: Measuring Performance of Object Storage. Ryan Meek, Principal Solutions Architect, and I will give a high-level overview of use cases for object storage, share best practices for testing, and explain how Caringo tackles performance testing on April 23 at 11am PT/2pm ET.
As attendees of the 2019 NAB Show in Las Vegas head home this afternoon, we wanted to share our reflections on the event with you. No other event floor is as vibrant and exciting as the Las Vegas NAB Show. With Creative and IT Professionals from verticals spanning Media & Entertainment to Government to Houses of Worship and everything in between, our team heard a solid theme emerging this year: the need for object storage is on the rise, and the audience now understands its value.

You Get Us (Object Storage Resonates)
Many attendees stopped and looked thoughtfully at our booth message: Caringo Object Storage for Access, Distribution & Archive. It was crystal clear this year, the attendees “got us.” After reading the signage, the attendees quickly jumped to asking probing questions about our technology.
Many of those questions were fielded by our knowledgeable Sales & Marketing team members (Ben Canter, Adrian “AJ” Herrera, Paul Phillips, David Fabrizio, Jerry Tohtz and me). However, those questions often led to in-depth conversations (in English and Spanish) and demos with our Principal Solutions Architect Ryan Meek and our Sales Engineer Jose Juan Marcos Gonzalez, with CEO Tony Barbagallo stepping up to the plate as needed.

What is Object Storage, and How Can it Help My Organization?
Our longevity in the object storage market (Swarm object storage technology is now field-hardened at version 10.2) provides us with the edge of having the most experienced object storage engineers in the industry. In our mission to make certain you fully understand object storage, we give you access to our technical staff both at in-person events and in our Tech Tuesday webinar series.
Caringo has a highly distributed global team with headquarters in Austin, Texas. For this crew, we pulled from three countries, two continents and four states. Seamlessly, we moved from being a virtual team to a physical team. Like a well-choreographed dance, we welcomed visitors to our booth and introduced them to the staff member who could best provide the information needed. In that process, we all got to know and like each other even more. We even rang in a landmark birthday for David Fabrizio, who has been at NAB for each of his birthdays over the past two decades!

Meeting Friends—Old and New
At the NAB Show, we make new friends and reconnect with old friends in the industry. There are few places better than Las Vegas to have a good time, and we hope you all enjoyed it as much as we did. See you at the 2020 NAB Show! And yes, we plan to bring signature purple light-up yo-yos once again.
As we enter count-down mode for the 2019 National Association of Broadcasters Show (NAB Show) in Las Vegas, I’d like to share a little secret with you. I’m not a gambler. If you read my blog last year, you probably already guessed this.

Protecting Your Assets: The Stakes Are High
Just how high are the stakes? No matter your industry, organization or data, you could jeopardize content as well as company earnings and reputation if data is not properly secured and cannot be accessed when needed. When you are storing and working with pre-production video footage, how many hours and dollars did it take to record? That price skyrockets even higher once your master asset has been produced and finalized.

Optimizing Collaboration and Storage
When you research storage solutions, are you also considering how your data will be used in creative collaboration processes? Long gone are the days when you could store assets in data storage silos and expect to keep your workflow moving efficiently. At the NAB Show, Media & Entertainment (M&E) IT professionals from Studios, Production Houses, Broadcasters and Service Providers will once again be looking to optimize storage and access at every stage of the digital asset lifecycle—from production to delivery to long-term preservation.

What’s New at the Caringo Booth This Year?
Over the past year, we have remained committed to making certain you can store what you need, ensure media integrity and keep assets online and accessible. Caringo provides S3- and NFS-accessible object storage designed for content access, distribution and archive. It can be purchased as software only and run on any x86 server, or you can now purchase it in an on-prem S3-compatible single Swarm Server Appliance.
In addition, we will be showcasing version 3.0 of our FileFly Data Management Tool, which now enables you to move data from NetApp and Windows Filers to Amazon (AWS), Microsoft Azure and Google cloud as well as to Caringo Swarm Object Storage. (Make sure to ask us about the new Free 25TB Community Edition of FileFly.)

What Can I Learn at the Caringo Booth This Year?
Our expert storage architects and engineers will be on hand to help you understand how Caringo’s object storage technologies can help you conquer the challenges of scaling storage in the on-demand world. Our experts can show you:
- Internal streaming and long-tail video-on-demand (VOD) directly from the archive layer at a lower cost than cloud-based services.
- Tape replacement that provides guaranteed content availability with minimal administration.
- Geo-dispersed collaboration platform that plugs into your asset manager or can be used as a stand-alone solution.
- Single target for multiple data sets on a future-proof platform that delivers unlimited scale.
Coming to Vegas early? Get a preview over beer and pizza Saturday or Sunday at the JB&A Pre-NAB Technology Event. Register now.
Then, stop by our booth at the NAB Show expo (#SL13310) to learn more. You can also visit https://www.caringo.com/solutions/media-entertainment/ or contact us to schedule a demo.
The post NAB Show 2019: With Object Storage, You Don’t Have to Gamble appeared first on Caringo.