MPEG-7


From Wikipedia, the free encyclopedia



MPEG-7 is a multimedia content description standard, standardized in ISO/IEC 15938 (Multimedia content description interface).[1][2][3][4] The description is associated with the content itself, to allow fast and efficient searching for material that is of interest to the user. MPEG-7 is formally called Multimedia Content Description Interface; unlike MPEG-1, MPEG-2 and MPEG-4, it is therefore not a standard that deals with the actual encoding of moving pictures and audio. It uses XML to store metadata, and can be attached to timecode in order to tag particular events, or to synchronise lyrics to a song, for example.


It was designed to standardize:


  • a set of Description Schemes ("DS") and Descriptors ("D")

  • a language to specify these schemes, called the Description Definition Language ("DDL")

  • a scheme for coding the description

The combination of MPEG-4 and MPEG-7 has sometimes been referred to as MPEG-47.[5]
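As an illustration of such an XML description, the following non-normative sketch anchors a free-text annotation to a timecode interval of a video. It assumes the commonly used urn:mpeg:mpeg7:schema:2001 namespace; element names such as Mpeg7, Description, VideoSegment, MediaTime and FreeTextAnnotation follow the conventions of the Multimedia description schemes part, but the exact structure, URI and values are illustrative rather than quoted from the standard.

  <Mpeg7 xmlns="urn:mpeg:mpeg7:schema:2001"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
    <Description xsi:type="ContentEntityType">
      <MultimediaContent xsi:type="VideoType">
        <Video>
          <MediaLocator>
            <!-- Hypothetical media location used for the example -->
            <MediaUri>file:///media/concert.mpg</MediaUri>
          </MediaLocator>
          <TemporalDecomposition>
            <VideoSegment id="chorus1">
              <TextAnnotation>
                <FreeTextAnnotation>First chorus; first lyric line starts here</FreeTextAnnotation>
              </TextAnnotation>
              <!-- Timecode anchor: the segment starts 1 min 30 s into the video and lasts 15 s -->
              <MediaTime>
                <MediaTimePoint>T00:01:30</MediaTimePoint>
                <MediaDuration>PT15S</MediaDuration>
              </MediaTime>
            </VideoSegment>
          </TemporalDecomposition>
        </Video>
      </MultimediaContent>
    </Description>
  </Mpeg7>

Because such a description is ordinary XML, it can be carried as text (TeM) or in the binary BiM form defined in Part 1, and either stored together with the content or kept separately with a link to the media.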







Introduction


MPEG-7 is intended to provide complementary functionality to the previous MPEG standards, representing information about the content, not the content itself ("the bits about the bits"). This functionality is the standardization of multimedia content descriptions. MPEG-7 can be used independently of the other MPEG standards; the description might even be attached to an analog movie. The representation that is defined within MPEG-4, i.e. the representation of audio-visual data in terms of objects, is however very well suited to what will be built on the MPEG-7 standard, and this representation is basic to the process of categorization. In addition, MPEG-7 descriptions could be used to improve the functionality of previous MPEG standards.

With these tools, an MPEG-7 Description can be built and deployed. According to the requirements document, "a Description consists of a Description Scheme (structure) and the set of Descriptor Values (instantiations) that describe the Data." A Descriptor Value is "an instantiation of a Descriptor for a given data set (or subset thereof)." The Descriptor itself is the syntactic and semantic definition of a feature of the content. The extraction algorithms are outside the scope of the standard, because their standardization is not required to allow interoperability.



Parts


MPEG-7 (ISO/IEC 15938) consists of several parts, each covering a certain aspect of the whole specification.


MPEG-7 Parts[4][6]

Part    | Number              | First public release date (first edition) | Latest public release date (edition) | Latest amendment | Title
Part 1  | ISO/IEC 15938-1     | 2002 | 2002 | 2006 | Systems
Part 2  | ISO/IEC 15938-2     | 2002 | 2002 |      | Description definition language
Part 3  | ISO/IEC 15938-3     | 2002 | 2002 | 2010 | Visual
Part 4  | ISO/IEC 15938-4     | 2002 | 2002 | 2006 | Audio
Part 5  | ISO/IEC 15938-5     | 2003 | 2003 | 2015 | Multimedia description schemes
Part 6  | ISO/IEC 15938-6     | 2003 | 2003 | 2011 | Reference software
Part 7  | ISO/IEC 15938-7     | 2003 | 2003 | 2011 | Conformance testing
Part 8  | ISO/IEC TR 15938-8  | 2002 | 2002 | 2011 | Extraction and use of MPEG-7 descriptions
Part 9  | ISO/IEC 15938-9     | 2005 | 2005 | 2012 | Profiles and levels
Part 10 | ISO/IEC 15938-10    | 2005 | 2005 |      | Schema definition
Part 11 | ISO/IEC TR 15938-11 | 2005 | 2005 | 2012 | MPEG-7 profile schemas
Part 12 | ISO/IEC 15938-12    | 2008 | 2012 |      | Query format
Part 13 | ISO/IEC 15938-13    | 2015 | 2015 |      | Compact descriptors for visual search

Part 1 (Systems) specifies the architectural framework of MPEG-7, the carriage of MPEG-7 content, TeM (the textual format for MPEG-7) and the binary format for MPEG-7 descriptions (BiM).[7]


Relation between description and content




[Figure: Independence between description and content]


An MPEG-7 architecture requirement is that description must be separate from the audiovisual content.


On the other hand, there must be a relation between the content and description. Thus the description is multiplexed with the content itself.


The figure above illustrates this relation between description and content.



MPEG-7 tools




[Figure: Relation between different tools and the elaboration process of MPEG-7]


MPEG-7 uses the following tools:



  • Descriptor (D): a representation of a feature, defined syntactically and semantically. A single object may be described by several descriptors.

  • Description Schemes (DS): specify the structure and semantics of the relations between their components, which may be descriptors (D) or other description schemes (DS).

  • Description Definition Language (DDL): an XML-based language used to define the structural relations between descriptors. It allows the creation and modification of description schemes and the creation of new descriptors (D).

  • System tools: deal with the binarization, synchronization, transport and storage of descriptors, as well as intellectual property protection.

The figure above illustrates the relation between the MPEG-7 tools; a combined example is sketched below.
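The following non-normative sketch shows how these tools relate in practice: the description scheme provides the overall structure, a visual descriptor (here the ScalableColor descriptor from the Visual part) defines one feature syntactically and semantically, and the numbers it carries are the descriptor values that some non-standardized extraction algorithm produced for a particular image. Element and attribute names follow common MPEG-7 usage but should be verified against the schema definition (Part 10); the URI and coefficient values are invented for the example.

  <Mpeg7 xmlns="urn:mpeg:mpeg7:schema:2001"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
    <!-- Description Scheme: the structure that holds descriptors and other schemes -->
    <Description xsi:type="ContentEntityType">
      <MultimediaContent xsi:type="ImageType">
        <Image>
          <MediaLocator>
            <!-- Hypothetical image location -->
            <MediaUri>http://example.org/images/sunset.jpg</MediaUri>
          </MediaLocator>
          <!-- Descriptor: the ScalableColor feature; its content is the set of Descriptor Values -->
          <VisualDescriptor xsi:type="ScalableColorType"
                            numOfCoeff="16" numOfBitplanesDiscarded="0">
            <Coeff>-6 18 3 0 -2 7 1 0 4 -1 0 2 5 0 -3 1</Coeff>
          </VisualDescriptor>
        </Image>
      </MultimediaContent>
    </Description>
  </Mpeg7>

The system tools would then take care of encoding such a description (for example into the BiM binary form), synchronizing it with the media, and transporting or storing it.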



MPEG-7 applications


There are many applications and application domains that can benefit from the MPEG-7 standard. A few examples are:



  • Digital library: Image/video catalogue, musical dictionary.

  • Multimedia directory services: e.g. yellow pages.

  • Broadcast media selection: Radio channel, TV channel.

  • Multimedia editing: Personalized electronic news service, media authoring.

  • Security services: Traffic control, production chains...

  • E-business: Product search.

  • Cultural services: Art galleries, museums...

  • Educational applications.

  • Biomedical applications.


  • Intelligent multimedia applications that leverage low-level multimedia semantics via formal representation and automated reasoning.[8]


Software and demonstrators for MPEG-7



  • Caliph & Emir: Annotation and retrieval of images based on MPEG-7 (GPL). Creates MPEG-7 XML files.[9]


  • C# Implementation: Open Source implementation of the MPEG-7 descriptors in C#.


  • Frameline 47 Video Notation: Frameline 47 from Versatile Delivery Systems, the first commercial MPEG-7 application. It uses an advanced content schema based on MPEG-7 to notate entire video files, or segments and groups of segments within a video file, according to the MPEG-7 convention (commercial tool).


  • Eptascape ADS200: uses a real-time MPEG-7 encoder on an analog camera video signal to identify interesting events, especially in surveillance applications (commercial tool).


  • IBM VideoAnnEx Annotation Tool: Creating MPEG-7 documents for video streams describing structure and giving keywords from a controlled vocabulary (binary release, restrictive license)


  • iFinder Medienanalyse- und Retrievalsystem (media analysis and retrieval system): Metadata extraction and search engine based on MPEG-7 (commercial tool)


  • MPEG-7 Audio Encoder: Creating MPEG-7 documents for audio documents describing low level audio characteristics (binary & source release, Java, GPL)


  • MPEG-7 Visual Descriptor Extraction: Software to extract MPEG-7 visual descriptors from images and image regions.


  • XM Feature Extraction Web Service: The functionalities of the eXperimentation Model (XM) are made available via a web service interface to enable automatic MPEG-7 low-level visual description of images.


  • TU Berlin MPEG-7 Audio Analyzer (Web-Demo): Creating MPEG-7 documents (XML) for audio documents (WAV, MP3). All 17 MPEG-7 low-level audio descriptors are implemented (commercial)


  • TU Berlin MPEG-7 Spoken Content Demonstrator (Web-Demo): Creating MPEG-7 documents (XML) with SpokenContent description from an input speech signal (WAV, MP3) (commercial)


  • MP7JRS C++ Library: Complete MPEG-7 implementation of parts 3, 4 and 5 (Visual, Audio and MDS) by the Joanneum Research Institute for Information and Communication Technologies, Audiovisual Media Group.


  • BilVideo-7: MPEG-7 compatible, distributed video indexing and retrieval system, supporting complex, multimodal, composite queries; developed by Bilkent University Multimedia Database Group (BILMDG).


  • UniSay: Post-production file analysis and audio processing based on MPEG-7.


See also


  • Exif

  • ID3

  • Metadata standards


  • MPEG-4 Part 11 – Scene description and application engine

  • Multimedia information retrieval

  • Query by humming


Limitations


The MPEG-7 standard was originally written in XML Schema (XSD), which constitutes semi-structured data. For example, the running time of a movie annotated using MPEG-7 in XML is machine-readable data, so software agents will know that the number expressing the running time is a positive integer, but such data is not machine-interpretable (it cannot be understood by agents), because it does not convey semantics (meaning); this discrepancy is known as the “Semantic Gap.” To address this issue, there were many attempts to map the MPEG-7 XML Schema to the Web Ontology Language (OWL), which is a structured-data equivalent of the terms of the MPEG-7 standard (MPEG-7Ontos, COMM, SWIntO, etc.). However, these mappings did not really bridge the “Semantic Gap,” because low-level video features alone are inadequate for representing video semantics.[10] In other words, annotating an automatically extracted video feature, such as color distribution, does not provide the meaning of the actual visual content.[11]
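To make the running-time example concrete, the following non-normative fragment uses the MediaTime and MediaDuration elements typically found in MPEG-7 descriptions; the duration value is invented:

  <!-- An XSD validator can confirm that PT1H49M is a well-formed duration,
       but nothing in the markup itself states that this value denotes the
       running time of a feature film, so a software agent cannot reason about it. -->
  <MediaTime>
    <MediaTimePoint>T00:00:00</MediaTimePoint>
    <MediaDuration>PT1H49M</MediaDuration>
  </MediaTime>

The OWL-based mappings mentioned above attach such fragments to formally defined ontology terms so that agents can at least relate the value to a concept such as a film's running time; the criticism cited here is that this still does not close the gap between automatically extracted low-level features and the meaning of the depicted content.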



Compare



  • Material Exchange Format (MXF), a container format for professional digital video and audio media defined by SMPTE.


References


  • B.S. Manjunath, Philippe Salembier and Thomas Sikora (eds.): Introduction to MPEG-7: Multimedia Content Description Interface. Wiley & Sons, April 2002. ISBN 0-471-48678-7

  • Harald Kosch: Distributed Multimedia Database Technologies Supported by MPEG-7 and MPEG-21. CRC Press, January 2004. ISBN 0-8493-1854-8

  • Giorgos Stamou and Stefanos Kollias (eds.): Multimedia Content and the Semantic Web: Standards, Methods and Tools. Wiley & Sons, May 2005. ISBN 0-470-85753-6

  • Hyoung-Gook Kim, Nicolas Moreau and Thomas Sikora: MPEG-7 Audio and Beyond: Audio Content Indexing and Retrieval. Wiley & Sons, October 2005. ISBN 0-470-09334-X



  1. ^ ISO. "ISO/IEC 15938-1:2002 - Information technology -- Multimedia content description interface -- Part 1: Systems". Retrieved 2009-10-31.


  2. ^ MPEG. "About MPEG - Achievements". chiariglione.org. Archived from the original on July 8, 2008. Retrieved 2009-10-31.


  3. ^ MPEG. "Terms of Reference". chiariglione.org. Archived from the original on February 21, 2010. Retrieved 2009-10-31.


  4. ^ ab MPEG. "MPEG standards - Full list of standards developed or under development". chiariglione.org. Archived from the original on April 20, 2010. Retrieved 2009-10-31.


  5. ^ NetworkDictionary. "Complete Protocol dictionary, glossary and reference - M". Retrieved 2009-12-26.


  6. ^ ISO/IEC JTC 1/SC 29 (2009-10-30). "MPEG-7 (Multimedia content description interface)". Archived from the original on 2013-12-31. Retrieved 2009-11-10.


  7. ^ ISO/IEC JTC1/SC29/WG11 (October 2004). "MPEG-7 Overview (version 10)". chiariglione.org. Retrieved 2009-11-01.


  8. ^ "MPEG-7 Ontology". Retrieved 29 June 2017.



  9. ^ Lux, Mathias. "Caliph & Emir: MPEG-7 photo annotation and retrieval." Proceedings of the 17th ACM international conference on Multimedia. ACM, 2009.


  10. ^ Sikos, Leslie F.; Powers, David M.W. (2015). "Knowledge-Driven Video Information Retrieval with LOD": 35–37. doi:10.1145/2810133.2810141.


  11. ^ Boll, Susanne; Klas, Wolfgang; Sheth, Amit (1998). "Overview on Using Metadata to Manage Multimedia Data". Using Metadata to Integrate and Apply Digital Media. McGraw-Hill. p. 3. ISBN 978-0070577350.




External links


  • MPEG-7 Overview

  • MPEG-7/-21 Community Portal










