
Commit

Merge pull request #1730 from dadoonet/sphinx-doc
Add the requirements.in file
dadoonet authored Oct 17, 2023
2 parents 5cc4672 + dbd8f50 commit 9f078fc
Showing 10 changed files with 80 additions and 24 deletions.
3 changes: 3 additions & 0 deletions docs/requirements.in
@@ -0,0 +1,3 @@
+# pip-compile requirements.in
+Sphinx>=5,<6
+sphinx_rtd_theme
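
The ``requirements.in`` file above holds only the loose, human-edited constraints; the pinned ``docs/requirements.txt`` shown next is generated from it. A minimal sketch of the regeneration workflow, assuming ``pip-compile`` comes from the separately installed ``pip-tools`` package (which is not itself pinned in this commit):

    $ pip install pip-tools          # provides the pip-compile command
    $ cd docs
    $ pip-compile requirements.in    # rewrites requirements.txt with fully pinned versions

Because pip-compile resolves against the interpreter it runs under (the generated header below says Python 3.7), re-running it on a different Python version may produce a different set of pins.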
64 changes: 61 additions & 3 deletions docs/requirements.txt
@@ -1,4 +1,62 @@
-# Defining the exact version will make sure things don't break
+#
+# This file is autogenerated by pip-compile with Python 3.7
+# by the following command:
+#
+#    pip-compile requirements.in
+#
+alabaster==0.7.13
+    # via sphinx
+babel==2.13.0
+    # via sphinx
+certifi==2023.7.22
+    # via requests
+charset-normalizer==3.3.0
+    # via requests
+docutils==0.17.1
+    # via
+    #   sphinx
+    #   sphinx-rtd-theme
+idna==3.4
+    # via requests
+imagesize==1.4.1
+    # via sphinx
+importlib-metadata==6.7.0
+    # via sphinx
+jinja2==3.1.2
+    # via sphinx
+markupsafe==2.1.3
+    # via jinja2
+packaging==23.2
+    # via sphinx
+pygments==2.16.1
+    # via sphinx
+pytz==2023.3.post1
+    # via babel
+requests==2.31.0
+    # via sphinx
+snowballstemmer==2.2.0
+    # via sphinx
sphinx==5.3.0
-sphinx_rtd_theme==1.1.1
-readthedocs-sphinx-search==0.1.1
+    # via
+    #   -r requirements.in
+    #   sphinx-rtd-theme
+sphinx-rtd-theme==1.1.1
+    # via -r requirements.in
+sphinxcontrib-applehelp==1.0.2
+    # via sphinx
+sphinxcontrib-devhelp==1.0.2
+    # via sphinx
+sphinxcontrib-htmlhelp==2.0.0
+    # via sphinx
+sphinxcontrib-jsmath==1.0.1
+    # via sphinx
+sphinxcontrib-qthelp==1.0.3
+    # via sphinx
+sphinxcontrib-serializinghtml==1.1.5
+    # via sphinx
+typing-extensions==4.7.1
+    # via importlib-metadata
+urllib3==2.0.6
+    # via requests
+zipp==3.15.0
+    # via importlib-metadata
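
To consume the pinned file, a docs contributor installs it and checks the resolved Sphinx version; a quick sanity check might look like this (the project's actual build commands are documented in ``docs/source/dev/doc.rst`` below):

    $ pip install -r docs/requirements.txt   # install the exact versions listed above
    $ sphinx-build --version                 # should report 5.3.0 with these pins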
2 changes: 1 addition & 1 deletion docs/source/admin/fs/elasticsearch.rst
@@ -510,7 +510,7 @@ Path prefix
^^^^^^^^^^^

.. versionadded:: 2.7 If your elasticsearch is running behind a proxy with url rewriting,
-you might have to specify a path prefix. This can be done with ``path_prefix`` setting:
+you might have to specify a path prefix. This can be done with ``path_prefix`` setting:

.. code:: yaml
2 changes: 1 addition & 1 deletion docs/source/admin/fs/local-fs.rst
@@ -237,7 +237,7 @@ to the index, set option `Add as Inner Object`_
which stores additional metadata and the XML contents under field
``object``.

-.. code:: json
+.. code:: yaml
name: "test"
fs:
14 changes: 2 additions & 12 deletions docs/source/conf.py
@@ -43,7 +43,7 @@

# development versions always have the suffix '-SNAPSHOT'
def read_version(full_version=True):
-    raw_version = config.get('FsCrawler', 'Version');
+    raw_version = config.get('FsCrawler', 'Version')
    return raw_version if full_version else raw_version.replace("-SNAPSHOT", "")

version = read_version(full_version=False)
@@ -89,7 +89,7 @@ def read_version(full_version=True):
#
# This is also used if you do content translation via gettext catalogs.
# Usually you set "language" from the command line for these cases.
-language = None
+#language = None

# List of patterns, relative to source directory, that match files and
# directories to ignore when looking for source files.
@@ -207,16 +207,6 @@ def read_version(full_version=True):
epub_exclude_files = ['search.html']


-# -- Extension configuration -------------------------------------------------
-
-from recommonmark.parser import CommonMarkParser
-
-source_parsers = {
-    '.md': CommonMarkParser,
-}
-
-source_suffix = ['.rst', '.md']
-
# -- Options for todo extension ----------------------------------------------

# If true, `todo` and `todoList` produce output, else they produce nothing.
7 changes: 6 additions & 1 deletion docs/source/dev/doc.rst
@@ -8,7 +8,7 @@ to have Python3 installed.

Assuming you have `Python3 <https://www.python.org/>`_ already, install `Sphinx <http://www.sphinx-doc.org/>`_::

-$ pip install sphinx sphinx-autobuild sphinx_rtd_theme recommonmark
+$ pip install -r docs/requirements.txt

Go to the ``docs`` directory and build the html documentation::

@@ -25,3 +25,8 @@ Then just edit the documentation and look for your changes at http://127.0.0.1:8

To learn more about the reStructuredText format, please look at the
`basic guide <http://www.sphinx-doc.org/en/master/usage/restructuredtext/basics.html>`_.
+
+To update the requirements file if you changed the ``requirements.in`` file, run::
+
+$ cd docs
+$ pip-compile requirements.in
2 changes: 1 addition & 1 deletion docs/source/user/getting_started.rst
@@ -68,7 +68,7 @@ something! ;-)

.. code:: json
-GET docs/doc/_search
+// GET docs/doc/_search
{
"query" : {
"query_string": {
2 changes: 1 addition & 1 deletion docs/source/user/ocr.rst
@@ -159,4 +159,4 @@ Supported strategies are:
* ``ocr_and_text``: OCR and text extraction is performed.

.. note:: When omitted, ``ocr_and_text`` value is used. If you have performance issues, it's worth using the ``auto`` option
-instead as only documents with barely no text will go through the OCR process.
+instead as only documents with barely no text will go through the OCR process.
4 changes: 2 additions & 2 deletions docs/source/user/tips.rst
@@ -53,7 +53,7 @@ Using docker with FSCrawler REST

To use the REST service available from 2.2 you can add the ``--rest`` flag to the FSCrawler docker container ``command:``. Note that you must expose the same ports that the REST service opens on in the docker container. For example, if your REST service starts on ``127.0.0.1:8080`` then expose the same ports in your FSCrawler docker-compose image:

-.. code:: yml
+.. code:: yaml
fscrawler:
context: ${PWD}
@@ -67,7 +67,7 @@ To use the REST service available from 2.2 you can add the ``--rest`` flag to th
Then expose the docker container you've created by changing the IP of the REST URL in your ``settings.yaml`` to the docker-compose container name:

-.. code:: yml
+.. code:: yaml
rest :
url: "http://fscrawler:8080"
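
Once such a compose service is defined, it can be brought up and probed from the host; a sketch under the assumption that the container's port 8080 is published to the host and that FSCrawler's REST status endpoint lives at ``/fscrawler/`` (the service name and port come from the snippet above; the endpoint path is an assumption, adjust it to your setup):

    $ docker-compose up -d fscrawler          # start the FSCrawler service defined above
    $ curl http://127.0.0.1:8080/fscrawler/   # assumed status endpoint of the REST service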
4 changes: 2 additions & 2 deletions docs/source/user/tutorial.rst
@@ -16,7 +16,7 @@ Install Elastic stack
^^^^^^^^^^^^^^^^^^^^^

* Download `Elasticsearch <https://www.elastic.co/downloads/elasticsearch>`_
-* Download `Kibana <https://www.elastic.co/downloads/kibana>`_
+* Download `Kibana <https://www.elastic.co/downloads/kibana>`__
* Start Elasticsearch server
* Start Kibana server
* Check that Kibana is running by opening http://localhost:5601
@@ -86,7 +86,7 @@ FSCrawler should index all the documents inside your directory.
Create Index pattern
^^^^^^^^^^^^^^^^^^^^

-* Open `Kibana <http://localhost:5601>`_
+* Open `Kibana <http://localhost:5601>`__
* Go to the `Management <http://0.0.0.0:5601/app/kibana#/management/>`_ page
* Open the `Index Patterns <http://0.0.0.0:5601/app/kibana#/management/kibana/index_patterns?_g=()>`_ page
under Kibana settings.
