Using AI for Metadata Creation
High quality metadata plays an outsized role in improving enterprise search results. But convincing people to consistently apply quality metadata has been an uphill battle for most companies. One long-standing solution is to automate metadata creation using rules-based content auto-classification products.
Back in 2004, I ran a large, greenfield enterprise content management program for a big UK university. I was lucky to work with information management experts in the university library and a member of the W3C metadata group on the project. Together we undertook an evaluation of automated content classification tools. The library team did not like any of them at all. They thought there were too many potential quality issues, and that we would be better off building metadata tagging into our content creation workflows.
So what has changed in the 15 years since that project? A lot.
AI and Metadata: The Rise of the Machines
You don’t need an army of Cyberdyne Systems T-1000 Terminators to sit and tag your content — the reality of AI is somewhat different from what's depicted in the movies. AI tools can be broken down into a set of broad technology capabilities:
- Neural Networks: Weighting and sifting inputs like biological neurons, to identify patterns and predict behaviors.
- Natural Language Processing: Understanding human patterns of language in text and in recorded speech.
- Statistical Machine Learning: Analysis of data sets through application of statistical models.
- Deep Learning: Sifting through many layers of variables or features to fit models to data for extraction of meaning, pattern recognition and comparison of complex concepts.
Depending on the product, it will apply different elements of these technologies to process your content in order to create and add new metadata. That additional metadata can both classify the content items and improve findability by adding keywords or other descriptors.
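As a minimal sketch of the idea, the snippet below tags a document against a controlled vocabulary and suggests categories and keywords. The taxonomy and terms are purely illustrative assumptions, not any vendor's vocabulary, and a real product would use statistical models rather than simple word matching:

```python
# Illustrative sketch of auto-classification: map controlled-vocabulary
# terms to categories, then tag a document by scanning its text.
# The taxonomy below is a made-up example, not a real product's rules.

TAXONOMY = {
    "finance": {"invoice", "credit", "payment", "budget"},
    "hr": {"vacation", "payroll", "benefits", "onboarding"},
}

def auto_tag(text: str) -> dict:
    """Return suggested metadata: matched categories and keywords."""
    words = {w.strip(".,;:").lower() for w in text.split()}
    categories = sorted(c for c, terms in TAXONOMY.items() if words & terms)
    keywords = sorted(w for terms in TAXONOMY.values() for w in words & terms)
    return {"categories": categories, "keywords": keywords}

doc = "Please process the attached invoice before the payroll run."
print(auto_tag(doc))
```

The output metadata could then be written into the content item's category and keyword fields in a repository or search index.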
For decades search engines handled textual content far better than other content types, such as images, audio or video files. This is because vendors created mechanisms for “opening” a file — be it a Microsoft Word or Excel file, a PDF or a .txt file — and indexing its content as well as any metadata. Search engines could not, however, index the content of a .jpg image, an .mp4 video or an .mp3 audio file.
A Rapidly Growing Market
Many specialist vendors as well as major content services providers now offer AI-based products that can analyze your content. For a lot of the big vendors, that means cloud services. Microsoft, Google, Amazon, IBM, OpenText, Oracle and many others provide cloud AI services you can plug into. Meanwhile, some of the specialist content analysis or general AI start-ups still offer licenses to run on servers in your on-premises data centers. So whether you're all in on the cloud, or work in a conservative regulated industry, you should be able to find a product that works for you.
Applying machine learning and NLP to content analysis can improve the findability of these previously hard to index rich content types through the automated creation of high quality metadata.
The field of medical imaging provides a well-known use case for image files. Machine learning and deep learning have been used to teach AI the interpretive skills of an experienced medical imaging technician or radiologist in analyzing X-rays, MRI or CT scan images. However, the same technologies can be applied to a much older business use case: optical character recognition of scanned and digitized paper documents, or even handwriting recognition on such images. Although we are talking about analysis of the content as part of a business process here, there is no reason why the processing should not also generate additional metadata for tagging the content item.
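To make the last point concrete, here is a small sketch of deriving metadata as a by-product of document processing. It assumes an OCR step (e.g. a tool like Tesseract) has already produced raw text; the date and reference-number patterns are hypothetical examples, not any system's actual rules:

```python
import re

# Sketch: pull simple metadata fields out of text that an OCR step has
# already extracted. The field patterns are illustrative assumptions.

DATE_RE = re.compile(r"\b\d{1,2}/\d{1,2}/\d{4}\b")
REF_RE = re.compile(r"\bREF[- ]?(\d+)\b", re.IGNORECASE)

def extract_fields(ocr_text: str) -> dict:
    """Derive candidate metadata fields from OCR output for tagging."""
    meta = {}
    if (date := DATE_RE.search(ocr_text)):
        meta["document_date"] = date.group(0)
    if (ref := REF_RE.search(ocr_text)):
        meta["reference"] = ref.group(1)
    return meta

scan = "Scanned memo dated 12/03/2021, REF-4521, notes follow."
print(extract_fields(scan))
```

The same pipeline that routes the scanned document through a business process can populate these fields on the content item at no extra cost.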
When we speak to Google Home, Amazon Alexa or Apple Siri, we all understand AI tech is being employed to understand our speech. The same neural network and natural language processing technology can be applied to audio files, such as the recording of call center voice calls, dictated memos, or even the audio track of video files, to pull out specific concepts and create keywords. It can also provide time indexes, for example, “at 3 minutes and 40 seconds into this recording the contact center operator mentioned credit cards.” The ability to automatically tag audio files with topics, categories, keywords, or even the name of who is speaking, will massively enhance their findability.
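The time-index idea can be sketched in a few lines. Assuming a speech-to-text step has already produced transcript segments with start times in seconds (the segments below are invented for illustration), finding and formatting topic mentions is straightforward:

```python
# Sketch of time-indexing a transcript: given (start_seconds, text)
# segments from an assumed speech-to-text step, report where a topic
# is mentioned in "minutes and seconds" form.

def mentions(transcript, keyword):
    """Yield human-readable time offsets where keyword occurs."""
    for seconds, text in transcript:
        if keyword.lower() in text.lower():
            yield f"{seconds // 60} minutes and {seconds % 60} seconds"

segments = [
    (15, "Thanks for calling, how can I help?"),
    (220, "Let me check the credit cards on your account."),
]
print(list(mentions(segments, "credit cards")))
```

Each mention could be stored as metadata on the audio file, so a search hit can jump straight to the relevant moment in the recording.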
Merging the technologies used to analyze still images and audio gives us many of the same advantages for video files. Think of the improvements in findability and utility when you can time index a video recording of a remote meeting to understand what was discussed when, or apply facial recognition to tag the video file with the names of those present in the “attendees” field.
AI Isn't a Hands-Off Project
AI-based automated content tagging can enrich your metadata and help improve findability for your content items, but there are some caveats (as with any technology). The key is in the application, as the hype around what can be achieved with AI technologies may create outsized expectations. Not only does it require considerable due diligence and analysis on the part of the organization when selecting a vendor to meet a particular business use case, but there is a distinct governance aspect as well. Transparent governance needs to be in place for any model development for machine learning and for the logical rules applied to NLP, especially in higher risk industries like the medical imaging example discussed above. Even if your AI application is deciphering handwritten notes on scanned documents, you'll need a good process for developing the rules applied, and good documentation of how the AI applies those rules to allow for human quality control checking.
But the bottom line is the judicious application of AI technologies to your content could improve its findability and improve search results, so come on, open the pod bay doors HAL!
About the Author
Jed Cawthorne is Director, Security & Governance Solutions at NetDocuments. He is involved in product management and in working with customers to make NetDocuments' phenomenally successful products even more so.