Merge pull request #1588 from jsoref/spelling
Fix spelling mistakes
gonzalo-bulnes authored Nov 22, 2022
2 parents 6fefb2a + 6ae29a3 commit ddd0ebd
Showing 28 changed files with 71 additions and 71 deletions.
2 changes: 1 addition & 1 deletion Doxyfile
Original file line number Diff line number Diff line change
@@ -753,7 +753,7 @@ SHOW_NAMESPACES = YES
# The FILE_VERSION_FILTER tag can be used to specify a program or script that
# doxygen should invoke to get the current version for each file (typically from
# the version control system). Doxygen will invoke the program by executing (via
- # popen()) the command command input-file, where command is the value of the
+ # popen()) the command input-file, where command is the value of the
# FILE_VERSION_FILTER tag, and input-file is the name of an input file provided
# by doxygen. Whatever the program writes to standard output is used as the file
# version. For an example see the documentation.
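The mechanism this comment describes can be sketched with a minimal stand-in filter (the filter command and input-file name below are illustrative, not a real Doxygen configuration):

```python
import subprocess
import sys

# Doxygen runs `<FILE_VERSION_FILTER> <input-file>` via popen() and uses
# the program's stdout as the file's version string. A real setup might
# point FILE_VERSION_FILTER at e.g. a `git log` command; this stand-in
# filter simply ignores its input-file argument and prints a fixed version.
file_version_filter = [sys.executable, "-c", "print('r123')"]
input_file = "Doxyfile"  # illustrative input-file name

result = subprocess.run(
    file_version_filter + [input_file], capture_output=True, text=True
)
version = result.stdout.strip()
print(version)  # r123
```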
4 changes: 2 additions & 2 deletions Makefile
@@ -164,10 +164,10 @@ requirements: ## Update *requirements.txt files if pinned versions do not compl
pip-compile --generate-hashes --output-file requirements/requirements.txt requirements/requirements.in
$(MAKE) dev-requirements

- # Explaination of the below shell command should it ever break.
+ # Explanation of the below shell command should it ever break.
# 1. Set the field separator to ": ##" and any make targets that might appear between : and ##
# 2. Use sed-like syntax to remove the make targets
- # 3. Format the split fields into $$1) the target name (in blue) and $$2) the target descrption
+ # 3. Format the split fields into $$1) the target name (in blue) and $$2) the target description
# 4. Pass this file as an arg to awk
# 5. Sort it alphabetically
# 6. Format columns with colon as delimiter.
8 changes: 4 additions & 4 deletions README.md
@@ -17,7 +17,7 @@ To learn more about architecture and our rationale behind our Qubes OS approach,

The quickest way to get started with running the client is to use the [developer environment](#developer-environment) that [runs against a test server running in a local docker container](#running-against-a-test-server). This differs from a staging or production environment where the client receives and sends requests over Tor. Things are a lot snappier in the developer environment and can sometimes lead to a much different user experience, which is why it is important to do end-to-end testing in Qubes using the [staging environment](#staging-environment), especially if you are modifying code paths involving how we handle server requests and responses.

- For reproducing production bugs or running demos, we recommend using the [Production Environment](#production-envrionment) that will allow you to test a nightly build of the client.
+ For reproducing production bugs or running demos, we recommend using the [Production Environment](#production-environment) that will allow you to test a nightly build of the client.

We support running the [developer environment on a non-Qubes OS](#developer-environment-on-a-non-qubes-os) for developer convenience. If this is your preferred environment, keep in mind that you, or a PR reviewer, will need to run tests in Qubes if you modify code paths involving any of the following:

@@ -181,7 +181,7 @@ See [SecureDrop docs on setting up a staging server](https://docs.securedrop.org
3. Create a `config.json` file

```
- cd securedrop-worksation
+ cd securedrop-workstation
cp config.json.example config.json
vi config.json
```
@@ -233,7 +233,7 @@ See [SecureDrop docs on setting up a server](https://docs.securedrop.org/en/late
3. Create a `config.json` file

```
- cd securedrop-worksation
+ cd securedrop-workstation
cp config.json.example config.json
vi config.json
```
@@ -333,7 +333,7 @@ There are two packages you'll have to install manually in order to run the entir
`apt install xvfb`
`apt install sqlite3`

- We launch tests via `xvfb-run` on an `xvfb` X server in order to support machines with no display hardware, like we have in CircleCI. Even when running tests on a machine with display hardware, `xvfb` is useful in that it prevents a bunch of windows and dialogs from popping up on your desktop. If you want to run tests without `xfvb` then you can just uninstall it and run thet tests and checks as described below.
+ We launch tests via `xvfb-run` on an `xvfb` X server in order to support machines with no display hardware, like we have in CircleCI. Even when running tests on a machine with display hardware, `xvfb` is useful in that it prevents a bunch of windows and dialogs from popping up on your desktop. If you want to run tests without `xvfb` then you can just uninstall it and run thet tests and checks as described below.

NOTE: `xvfb-run` will start and stop `xvfb` for you.

6 changes: 3 additions & 3 deletions changelog.md
@@ -34,8 +34,8 @@
* UI fix to bottom margin in conversation view (#1391)
* (Dev) Refactors that break up the widgets module into smaller components (#1377-1383, #1390, #1393, #1394)
* (Dev) Use Debian Stable container images in CI (#1385)
- * (Dev) New templates for creating different types of Github Issues (#1392)
- * (Dev) Add localization testing task to the release Github Issue template (#1401)
+ * (Dev) New templates for creating different types of GitHub Issues (#1392)
+ * (Dev) Add localization testing task to the release GitHub Issue template (#1401)

## 0.5.1
* Fix Python symlink, which broke package in 0.5.0
@@ -217,7 +217,7 @@
* No longer sync after sending a reply (#722).
* Update gui instead of sync when a file is missing (#724).
* Revert usage of subprocess.check_output text parameter (#755).
- * Update obselete original_filename usage in file_ready (#773).
+ * Update obsolete original_filename usage in file_ready (#773).
* Don't import source keys we already have (#749).
* Make sync continuous (#739).
* Remove shadow on sign in button (#763).
2 changes: 1 addition & 1 deletion securedrop_client/api_jobs/downloads.py
@@ -93,7 +93,7 @@ def call_download_api(
self, api: API, db_object: Union[File, Message, Reply]
) -> Tuple[str, str]:
"""
- Method for making the actual API call to downlod the file and handling the result.
+ Method for making the actual API call to download the file and handling the result.
This MUST return the (etag, filepath) tuple response from the server and MUST raise an
exception if and only if the download fails.
2 changes: 1 addition & 1 deletion securedrop_client/api_jobs/sync.py
@@ -103,7 +103,7 @@ def _update_users(session: Session, remote_users: List[SDKUser]) -> None:
#
# In order to support an edge case that can occur on a pre-2.2.0 server that does not create
# a "deleted" user account, the client will create one locally when there are draft replies
- # that need to be re-assoicated. Once the "deleted" user account exists on the server, it
+ # that need to be re-associated. Once the "deleted" user account exists on the server, it
# will replace the local one.
for uuid, account in local_users.items():
# Do not delete the local "deleted" user account if there is no "deleted" user account
2 changes: 1 addition & 1 deletion securedrop_client/db.py
@@ -234,7 +234,7 @@ class Message(Base):

content = Column(
Text,
- # this check contraint ensures the state of the DB is what one would expect
+ # this check constraint ensures the state of the DB is what one would expect
CheckConstraint(
"CASE WHEN is_downloaded = 0 THEN content IS NULL ELSE 1 END",
name="ck_message_compare_download_vs_content",
6 changes: 3 additions & 3 deletions securedrop_client/gui/widgets.py
@@ -639,7 +639,7 @@ def show_sources(self, sources: List[Source]) -> None:
else:
self.empty_conversation_view.hide()

- # If the source list in the GUI is empty, then we will run the optimized intial update.
+ # If the source list in the GUI is empty, then we will run the optimized initial update.
# Otherwise, do a regular source list update.
if not self.source_list.source_items:
self.source_list.initial_update(sources)
@@ -3197,8 +3197,8 @@ def _on_sync_succeeded(self) -> None:

class ReplyTextEdit(QPlainTextEdit):
"""
- A plaintext textbox with placeholder that disapears when clicked and
- a richtext lable on top to replace the placeholder functionality
+ A plaintext textbox with placeholder that disappears when clicked and
+ a richtext label on top to replace the placeholder functionality
"""

def __init__(self, source: Source, controller: Controller) -> None:
6 changes: 3 additions & 3 deletions securedrop_client/logic.py
@@ -432,7 +432,7 @@ def setup(self) -> None:
Setup the application with the default state of:
* Not logged in.
- * Show most recent state of syncronised sources.
+ * Show most recent state of synchronized sources.
* Show the login screen.
* Check the sync status every 30 seconds.
"""
@@ -630,7 +630,7 @@ def on_sync_started(self) -> None:

def on_sync_success(self) -> None:
"""
- Called when syncronisation of data via the API queue succeeds.
+ Called when synchronization of data via the API queue succeeds.
* Set last sync flag
* Display the last sync time and updated list of sources in GUI
@@ -670,7 +670,7 @@ def on_sync_success(self) -> None:

def on_sync_failure(self, result: Exception) -> None:
"""
- Called when syncronisation of data via the API fails after a background sync. If the reason
+ Called when synchronization of data via the API fails after a background sync. If the reason
a sync fails is ApiInaccessibleError then we need to log the user out for security reasons
and show them the login window in order to get a new token.
"""
8 changes: 4 additions & 4 deletions securedrop_client/queue.py
@@ -46,7 +46,7 @@ class RunnableQueue(QObject):
ApiInaccessibleError before it makes an api call, which will repeat this process.
Any other exception encountered while processing a job is unexpected, so the queue will drop the
- job and continue on to processing the next job. The job itself is responsible for emiting the
+ job and continue on to processing the next job. The job itself is responsible for emitting the
success and failure signals, so when an unexpected error occurs, it should emit the failure
signal so that the Controller can respond accordingly.
"""
@@ -75,9 +75,9 @@ def __init__(self, api_client: API, session_maker: scoped_session) -> None:
self.api_client = api_client
self.session_maker = session_maker
self.queue = PriorityQueue() # type: PriorityQueue[Tuple[int, QueueJob]]
- # `order_number` ensures jobs with equal priority are retrived in FIFO order. This is needed
- # because PriorityQueue is implemented using heapq which does not have sort stability. For
- # more info, see : https://bugs.python.org/issue17794
+ # `order_number` ensures jobs with equal priority are retrieved in FIFO order. This is
+ # needed because PriorityQueue is implemented using heapq which does not have sort
+ # stability. For more info, see : https://bugs.python.org/issue17794
self.order_number = itertools.count()
self.current_job = None # type: Optional[QueueJob]
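The tie-breaking pattern described in the comment above can be sketched standalone (a minimal example of the technique, not the client's actual class):

```python
import itertools
from queue import PriorityQueue

# PriorityQueue is backed by heapq, which is not sort-stable: two entries
# with equal priority could come back in either order. A monotonically
# increasing counter as the second tuple element breaks ties, so jobs of
# equal priority are retrieved in FIFO order.
queue = PriorityQueue()
order_number = itertools.count()

for job in ["first", "second", "third"]:
    priority = 1  # all jobs share the same priority
    queue.put((priority, next(order_number), job))

drained = [queue.get()[2] for _ in range(3)]
print(drained)  # ['first', 'second', 'third']
```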

2 changes: 1 addition & 1 deletion securedrop_client/state/state.py
@@ -90,7 +90,7 @@ def selected_conversation(self, id: Optional[ConversationId]) -> None:

@property
def selected_conversation_has_downloadable_files(self) -> bool:
"""Whether the selected conversation has any files that are not alredy downloaded"""
"""Whether the selected conversation has any files that are not already downloaded"""
selected_conversation_id = self._selected_conversation
if selected_conversation_id is None:
return False
4 changes: 2 additions & 2 deletions securedrop_client/storage.py
@@ -228,7 +228,7 @@ def update_local_storage(

# Remove source UUIDs from DeletedConversation table and/or the DeletedSource table.
# Records enter these tables when a user deletes data locally and the
- # data is succesfully scheduled for deletion on the server. In order to guard
+ # data is successfully scheduled for deletion on the server. In order to guard
# against locally-deleted records being re-added to the database (even for a few seconds)
# during a stale sync, we flag them in these tables ("Deleting files and messages"
# corresponds to the DeletedConversation table, and deleting a source corresponds
@@ -246,7 +246,7 @@ def _get_flagged_locally_deleted(
) -> Tuple[List[DeletedConversation], List[DeletedSource]]:
"""
Helper function that returns two lists of source UUIDs, corresponding to
- locally-deleted conversations and sources, respsectively.
+ locally-deleted conversations and sources, respectively.
The first sync after a conversation or source is deleted locally, we avoid updating it, in
order to avoid potentially re-downloading deleted data in a network race.
2 changes: 1 addition & 1 deletion securedrop_client/utils.py
@@ -43,7 +43,7 @@ def safe_mkdir(
#
# Note: We do not use parents=True because the parent directories will not be created with the
# specified mode. Parents are created using system default permissions, which we modify to be
- # 700 via os.umask in the Window (QMainWindow) contructor. Creating directories one-by-one with
+ # 700 via os.umask in the Window (QMainWindow) constructor. Creating directories one-by-one with
# mode=0o0700 is not necessary but adds defense in depth.
relative_path = relative_filepath(full_path, base_path)
for parent in reversed(relative_path.parents):
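The directory-creation pattern this comment describes can be sketched as follows (paths are illustrative; this is not the client's actual helper):

```python
import stat
import tempfile
from pathlib import Path

# Create each directory of a relative path one-by-one with mode 0o0700,
# rather than using parents=True, which would create the intermediate
# directories with system default permissions instead.
base_path = Path(tempfile.mkdtemp())
relative_path = Path("a/b/c")  # illustrative relative path

for parent in reversed(relative_path.parents):
    if parent != Path("."):
        (base_path / parent).mkdir(mode=0o0700)
(base_path / relative_path).mkdir(mode=0o0700)

# mode=0o0700 guarantees group/other permission bits are cleared,
# regardless of the process umask.
mode = stat.S_IMODE((base_path / "a").stat().st_mode)
print(mode & 0o077 == 0)  # True
```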
2 changes: 1 addition & 1 deletion setup.py
@@ -21,7 +21,7 @@
long_description=long_description,
long_description_content_type="text/markdown",
license="AGPLv3+",
- install_requires=["SQLALchemy", "alembic", "securedrop-sdk", "python-dateutil", "arrow"],
+ install_requires=["SQLAlchemy", "alembic", "securedrop-sdk", "python-dateutil", "arrow"],
python_requires=">=3.5",
url="https://github.com/freedomofpress/securedrop-proxy",
packages=["securedrop_client", "securedrop_client.gui", "securedrop_client.resources"],
4 changes: 2 additions & 2 deletions tests/api_jobs/test_base.py
@@ -17,7 +17,7 @@ def test_ApiInaccessibleError_init():
assert str(err) == msg


- def test_ApiJob_raises_NotImplemetedError():
+ def test_ApiJob_raises_NotImplementedError():
job = ApiJob()

with pytest.raises(NotImplementedError):
@@ -101,7 +101,7 @@ def test_ApiJob_other_error(mocker):


@pytest.mark.parametrize("exception", [RequestTimeoutError, ServerConnectionError])
- def test_ApiJob_retry_suceeds_after_failed_attempt(mocker, exception):
+ def test_ApiJob_retry_succeeds_after_failed_attempt(mocker, exception):
"""Retry logic: after failed attempt should succeed"""

number_of_attempts = 5
16 changes: 8 additions & 8 deletions tests/api_jobs/test_sync.py
@@ -49,7 +49,7 @@ def test_MetadataSyncJob_has_default_timeout(mocker, homedir, session, session_m
assert api_client.default_request_timeout == job.DEFAULT_REQUEST_TIMEOUT


- def test_MetadataSyncJob_takes_overriden_timeout(mocker, homedir, session, session_maker):
+ def test_MetadataSyncJob_takes_overridden_timeout(mocker, homedir, session, session_maker):
api_client = mocker.patch("securedrop_client.logic.sdclientapi.API")
remote_user = factory.RemoteUser()
api_client.get_users = mocker.MagicMock(return_value=[remote_user])
@@ -101,8 +101,8 @@ def test_MetadataSyncJob_updates_application_state(mocker, homedir, session, ses
app_state = state.State()
state_updater = mocker.patch("securedrop_client.api_jobs.sync._update_state")

- exising_user = factory.User(uuid="abc123-ima-uuid")
- session.add(exising_user)
+ existing_user = factory.User(uuid="abc123-ima-uuid")
+ session.add(existing_user)

job = MetadataSyncJob(homedir, app_state)
job.call_api(api_client, session)
@@ -120,15 +120,15 @@ def test_MetadataSyncJob_updates_existing_user(mocker, homedir, session, session
)
api_client.get_users = mocker.MagicMock(return_value=[remote_user])

- exising_user = factory.User(uuid="abc123-ima-uuid")
- session.add(exising_user)
+ existing_user = factory.User(uuid="abc123-ima-uuid")
+ session.add(existing_user)

job = MetadataSyncJob(homedir)
job.call_api(api_client, session)

- assert exising_user.username == "new-username"
- assert exising_user.firstname == "NewFirstName"
- assert exising_user.lastname == "NewLastName"
+ assert existing_user.username == "new-username"
+ assert existing_user.firstname == "NewFirstName"
+ assert existing_user.lastname == "NewLastName"


def test_MetadataSyncJob_deletes_user(mocker, homedir, session, session_maker):
2 changes: 1 addition & 1 deletion tests/api_jobs/test_updatestar.py
@@ -57,7 +57,7 @@ def test_unstar_if_star(homedir, mocker, session, session_maker):

job.call_api(api_client, session)

- # ensure we call remove start wtih right source uuid
+ # ensure we call remove start with right source uuid
mock_source_init.assert_called_once_with(uuid=source.uuid)
api_client.remove_star.assert_called_once_with(mock_sdk_source)

2 changes: 1 addition & 1 deletion tests/api_jobs/test_uploads.py
@@ -255,7 +255,7 @@ def test_send_reply_sql_exception_during_failure(
session.add(source)

# Note that we do not add a DraftReply. An exception will occur when we try
- # to set the reply status to 'FAILED' for a non-existent reply, which we
+ # to set the reply status to 'FAILED' for a nonexistent reply, which we
# expect to be handled.

gpg = GpgHelper(homedir, session_maker, is_qubes=False)
2 changes: 1 addition & 1 deletion tests/conftest.py
@@ -133,7 +133,7 @@ def export_service():
# Ensure the export_service doesn't rely on Qubes OS:
export_service._run_disk_test = lambda dir: None
export_service._run_usb_test = lambda dir: None
- export_service._run_disk_export = lambda dir, paths, pasphrase: None
+ export_service._run_disk_export = lambda dir, paths, passphrase: None
export_service._run_printer_preflight = lambda dir: None
export_service._run_print = lambda dir, paths: None
return export_service
4 changes: 2 additions & 2 deletions tests/gui/conversation/export/test_device.py
@@ -154,7 +154,7 @@ def test_Device_run_export_preflight_checks(homedir, mocker, source):

def test_Device_export_file_to_usb_drive(homedir, mocker):
"""
- The signal `export_requested` should be emmited during export_file_to_usb_drive.
+ The signal `export_requested` should be emitted during export_file_to_usb_drive.
"""
gui = mocker.MagicMock(spec=Window)
with threads(3) as [sync_thread, main_queue_thread, file_download_queue_thread]:
@@ -219,7 +219,7 @@ def test_Device_export_file_to_usb_drive_when_orig_file_already_exists(
homedir, config, mocker, source
):
"""
- The signal `export_requested` should still be emmited if the original file already exists.
+ The signal `export_requested` should still be emitted if the original file already exists.
"""
gui = mocker.MagicMock(spec=Window)
with threads(3) as [sync_thread, main_queue_thread, file_download_queue_thread]:
10 changes: 5 additions & 5 deletions tests/gui/test_actions.py
@@ -35,7 +35,7 @@ def _dialog_constructor(source: Source) -> QDialog:
)

def test_deletes_conversation_when_dialog_accepted(self):
- # Accept the confimation dialog from a separate thread.
+ # Accept the confirmation dialog from a separate thread.
QTimer.singleShot(10, self._dialog.accept)

self.action.trigger()
@@ -46,7 +46,7 @@ def test_deletes_conversation_when_dialog_accepted(self):
)

def test_does_not_delete_conversation_when_dialog_rejected(self):
- # Reject the confimation dialog from a separate thread.
+ # Reject the confirmation dialog from a separate thread.
QTimer.singleShot(10, self._dialog.reject)

self.action.trigger()
@@ -70,7 +70,7 @@ def test_requires_authenticated_journalist(self):
def test_deletes_nothing_if_no_conversation_is_selected(self):
self._app_state.selected_conversation = None

- # Accept the confimation dialog from a separate thread.
+ # Accept the confirmation dialog from a separate thread.
QTimer.singleShot(10, self._dialog.accept)

self.action.trigger()
@@ -92,15 +92,15 @@ def _dialog_constructor(source: Source) -> QDialog:
self.action = DeleteSourceAction(self._source, _menu, self._controller, _dialog_constructor)

def test_deletes_source_when_dialog_accepted(self):
- # Accept the confimation dialog from a separate thread.
+ # Accept the confirmation dialog from a separate thread.
QTimer.singleShot(10, self._dialog.accept)

self.action.trigger()

self._controller.delete_source.assert_called_once_with(self._source)

def test_does_not_delete_source_when_dialog_rejected(self):
- # Reject the confimation dialog from a separate thread.
+ # Reject the confirmation dialog from a separate thread.
QTimer.singleShot(10, self._dialog.reject)

self.action.trigger()