java.lang.Object
    org.semanticdesktop.aperture.subcrawler.base.AbstractSubCrawler
        org.semanticdesktop.aperture.subcrawler.base.AbstractArchiverSubCrawler
public abstract class AbstractArchiverSubCrawler
A SubCrawler implementation working with archive files, i.e. files that contain a number of other files. It aims to be an abstraction over all known archive formats (zip, tar, etc.).
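The entry-by-entry iteration that this class abstracts can be illustrated with the JDK's own zip support. The following self-contained sketch uses only `java.util.zip` (not Aperture types, whose concrete subclasses are not shown here) to walk an archive stream the way a subcrawler walks its entries:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.List;
import java.util.zip.ZipEntry;
import java.util.zip.ZipInputStream;
import java.util.zip.ZipOutputStream;

public class ArchiveWalk {

    // Walk a zip stream and collect the paths of all entries: the same
    // entry-by-entry loop an archive subcrawler performs over its input.
    static List<String> listEntries(ZipInputStream zin) throws IOException {
        List<String> paths = new ArrayList<>();
        ZipEntry entry;
        while ((entry = zin.getNextEntry()) != null) {
            paths.add(entry.getName());
            zin.closeEntry();
        }
        return paths;
    }

    public static void main(String[] args) throws IOException {
        // Build a tiny zip archive in memory so the example is self-contained.
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        try (ZipOutputStream zout = new ZipOutputStream(buf)) {
            zout.putNextEntry(new ZipEntry("docs/readme.txt"));
            zout.write("hello".getBytes(StandardCharsets.UTF_8));
            zout.closeEntry();
            zout.putNextEntry(new ZipEntry("docs/notes.txt"));
            zout.write("world".getBytes(StandardCharsets.UTF_8));
            zout.closeEntry();
        }
        List<String> paths = listEntries(
                new ZipInputStream(new ByteArrayInputStream(buf.toByteArray())));
        System.out.println(paths); // [docs/readme.txt, docs/notes.txt]
    }
}
```

A concrete `AbstractArchiverSubCrawler` subclass would wrap such a format-specific stream in an `ArchiveInputStream` (via `getArchiveInputStream`) so the base class can run this loop uniformly across formats.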
Nested Class Summary

protected static class AbstractArchiverSubCrawler.ArchiveEntry
    Encapsulates an archive entry.

protected static class AbstractArchiverSubCrawler.ArchiveInputStream
    An input stream encapsulating an archive stream with compressed data.
Field Summary

static int MAX_ZIP_BOMB_REPEAT_COUNT
    The maximal number of times a path may repeat in the URI of a resource before it is considered a zip bomb of the kind similar to the well-known droste.zip.
Constructor Summary

AbstractArchiverSubCrawler()
Method Summary

protected abstract AbstractArchiverSubCrawler.ArchiveInputStream getArchiveInputStream(InputStream compressedStream)

DataObject getDataObject(URI parentUri, String path, InputStream stream, DataSource dataSource, Charset charset, String mimeType, RDFContainerFactory factory)
    Gets a DataObject from the specified stream with the given path.

void stopSubCrawler()
    Stops a running crawl as fast as possible.

void subCrawl(URI id, InputStream stream, SubCrawlerHandler handler, DataSource dataSource, AccessData accessData, Charset charset, String mimeType, RDFContainer parentMetadata)
    Starts crawling the given stream and reports the encountered DataObjects to the given SubCrawlerHandler.
Methods inherited from class org.semanticdesktop.aperture.subcrawler.base.AbstractSubCrawler
    createChildUri, getUriPrefix

Methods inherited from class java.lang.Object
    clone, equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
Field Detail
public static final int MAX_ZIP_BOMB_REPEAT_COUNT
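The zip-bomb guard behind `MAX_ZIP_BOMB_REPEAT_COUNT` counts how often the same path segment recurs in a resource URI; a self-containing archive like droste.zip produces URIs where its own name repeats without bound. The detection sketch below is an assumption about how such a check can work, not the actual Aperture code; the threshold `MAX_REPEAT` is a hypothetical stand-in for the real constant's value, which is not shown in this document:

```java
public class ZipBombCheck {

    // Hypothetical threshold standing in for MAX_ZIP_BOMB_REPEAT_COUNT
    // (the actual value is not given in this documentation).
    static final int MAX_REPEAT = 10;

    // Return the highest number of times any single segment occurs in a
    // slash-separated path, e.g. "a/a/a/b" -> 3.
    static int maxSegmentRepeat(String path) {
        String[] segments = path.split("/");
        int max = 0;
        for (String s : segments) {
            if (s.isEmpty()) continue;
            int count = 0;
            for (String t : segments) {
                if (s.equals(t)) count++;
            }
            max = Math.max(max, count);
        }
        return max;
    }

    static boolean looksLikeZipBomb(String path) {
        return maxSegmentRepeat(path) > MAX_REPEAT;
    }

    public static void main(String[] args) {
        // A self-containing archive keeps re-appearing in its own children's paths.
        System.out.println(maxSegmentRepeat("droste.zip/droste.zip/droste.zip")); // 3
        System.out.println(looksLikeZipBomb("docs/readme.txt")); // false
    }
}
```

Capping the repeat count bounds recursion depth, so a crawl of a self-referential archive terminates instead of expanding forever.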
Constructor Detail
public AbstractArchiverSubCrawler()
Method Detail

protected abstract AbstractArchiverSubCrawler.ArchiveInputStream getArchiveInputStream(InputStream compressedStream)

Parameters:
    compressedStream - the stream with the compressed archive data
public void subCrawl(URI id, InputStream stream, SubCrawlerHandler handler, DataSource dataSource, AccessData accessData, Charset charset, String mimeType, RDFContainer parentMetadata) throws SubCrawlerException

Description copied from interface: SubCrawler

Parameters:
    id - the URI identifying the object (e.g. a file or web page) from which the stream was obtained. This URI is treated as the URI of the parent object; all objects encountered in the stream are considered to be contained within the parent object. (optional, the implementation may use this URI or the one returned by the RDFContainer.getDescribedUri() method of the parentMetadata)
    stream - the stream to be crawled (obligatory)
    handler - the crawler handler that is to receive the notifications from the SubCrawler (obligatory)
    dataSource - the data source that will be returned by the DataObject.getDataSource() method of the returned data objects. Some implementations may require that this reference is not null and that it contains some particular information.
    accessData - the AccessData used to determine whether the encountered objects are to be returned as new, modified, unmodified or deleted. Information about new or modified objects is stored within it for use in future crawls. This parameter may be null if this functionality is not desired, in which case all DataObjects are reported as new. (optional)
    charset - the charset in which the input stream is encoded (optional)
    mimeType - the MIME type of the passed stream (optional)
    parentMetadata - the 'parent' RDFContainer, which will contain the metadata about the top-level entity in the stream. A SubCrawler may (in some cases) limit itself to augmenting the metadata in this RDFContainer without delivering any additional DataObjects. (obligatory)
Throws:
    SubCrawlerException - if any of the obligatory parameters is null or if any error occurred during the crawling process
See Also:
    SubCrawler.subCrawl(URI, InputStream, SubCrawlerHandler, DataSource, AccessData, Charset, String, RDFContainer)
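The parent/child containment that the id parameter establishes is reflected in how child URIs are formed (see the inherited createChildUri and getUriPrefix). Aperture-style subcrawler URIs follow a jar:-URL-like convention, `<prefix>:<parentUri>!/<entryPath>`; the helper below sketches that convention as an assumption, using plain string handling rather than the real createChildUri:

```java
public class ChildUris {

    // Sketch of the assumed "<prefix>:<parent>!/<path>" child-URI scheme
    // for archive subcrawlers; not the actual Aperture implementation.
    static String childUri(String prefix, String parentUri, String entryPath) {
        // Strip a leading slash so entry paths are relative to the archive root.
        String path = entryPath.startsWith("/") ? entryPath.substring(1) : entryPath;
        return prefix + ":" + parentUri + "!/" + path;
    }

    public static void main(String[] args) {
        String child = childUri("zip", "file:/home/user/data.zip", "docs/readme.txt");
        System.out.println(child); // zip:file:/home/user/data.zip!/docs/readme.txt
    }
}
```

Because the parent URI is embedded whole, a nested archive nests its prefix again, which is exactly why a repeated-segment guard such as MAX_ZIP_BOMB_REPEAT_COUNT can detect self-containing archives from the URI alone.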
public DataObject getDataObject(URI parentUri, String path, InputStream stream, DataSource dataSource, Charset charset, String mimeType, RDFContainerFactory factory) throws SubCrawlerException, PathNotFoundException

Description copied from interface: SubCrawler
Specified by:
    getDataObject in interface SubCrawler
Overrides:
    getDataObject in class AbstractSubCrawler
Parameters:
    parentUri - the URI of the parent object where the path will be looked for
    path - the path of the requested resource
    stream - the stream that contains the resource
    dataSource - the data source that will be returned by the DataObject.getDataSource() method of the returned data object. Some implementations may require that this reference is not null and that it contains some particular information.
    charset - the charset in which the input stream is encoded (optional)
    mimeType - the MIME type of the passed stream (optional)
    factory - an RDFContainerFactory that delivers the RDFContainer to which the metadata of the DataObject should be added. The provided RDFContainer can later be retrieved as the DataObject's metadata container.
Throws:
    SubCrawlerException - if any I/O error occurs
    PathNotFoundException - if the requested path is not found

public void stopSubCrawler()
Description copied from interface: SubCrawler