
Support correctly long path or filename #24224

Open
florian-prd opened this issue Nov 19, 2020 · 31 comments
Labels
0. Needs triage Pending check for reproducibility or if it fits our roadmap bug feature: dav hotspot: filename handling Filenames - invalid, portable, blacklisting, etc.

Comments

@florian-prd

florian-prd commented Nov 19, 2020

Steps to reproduce

  1. Upload a file (desktop client) with a path of length 150 (for example Bibliography/VacuumLeaks/xxx where xxx is of length 138)

Expected behaviour

The file should be correctly uploaded

Actual behaviour

The GUI complains with the following error:
2020-11-03 14:23:31:820 [ warning nextcloud.gui.activity ]: Item “Bibliography/VacuumLeaks/xxxx” retrieved resulted in “File name too long”

Server configuration

Operating system:
Ubuntu 20.04 (also tested with 18.04)
Web server:
nginx version: nginx/1.18.0 (Ubuntu)
Database:
mysqld 10.3.25-MariaDB-0ubuntu0.20.04.1
PHP version:
7.4 (also 7.2 tested)
Nextcloud version: (see Nextcloud admin page)
19.0.4
Updated from an older Nextcloud/ownCloud or fresh install:
Updated
Where did you install Nextcloud from:
Website of Nextcloud
Signing status:


No errors have been found.

List of activated apps:

App list

Enabled:

  • accessibility: 1.5.0
  • activity: 2.12.1
  • admin_audit: 1.9.0
  • calendar: 2.0.4
  • cloud_federation_api: 1.2.0
  • comments: 1.9.0
  • contacts: 3.4.1
  • contactsinteraction: 1.0.0
  • dav: 1.15.0
  • federatedfilesharing: 1.9.0
  • federation: 1.9.0
  • files: 1.14.0
  • files_pdfviewer: 1.8.0
  • files_rightclick: 0.16.0
  • files_sharing: 1.11.0
  • files_trashbin: 1.9.0
  • files_versions: 1.12.0
  • files_videoplayer: 1.8.0
  • firstrunwizard: 2.8.0
  • logreader: 2.4.0
  • lookup_server_connector: 1.7.0
  • nextcloud_announcements: 1.8.0
  • notifications: 2.7.0
  • oauth2: 1.7.0
  • password_policy: 1.9.1
  • photos: 1.1.0
  • privacy: 1.3.0
  • provisioning_api: 1.9.0
  • recommendations: 0.7.0
  • serverinfo: 1.9.0
  • settings: 1.1.0
  • sharebymail: 1.9.0
  • support: 1.2.1
  • survey_client: 1.7.0
  • systemtags: 1.9.0
  • text: 3.0.1
  • theming: 1.10.0
  • twofactor_backupcodes: 1.8.0
  • updatenotification: 1.9.0
  • viewer: 1.3.0
  • workflowengine: 2.1.0

Disabled:
  • encryption
  • files_accesscontrol-disabled
  • files_external
  • groupfolders
  • richdocuments_old
  • user_ldap

Nextcloud configuration:

Config report

{
    "system": {
        "instanceid": "REMOVED SENSITIVE VALUE",
        "passwordsalt": "REMOVED SENSITIVE VALUE",
        "secret": "REMOVED SENSITIVE VALUE",
        "trusted_domains": [
            "REMOVED SENSITIVE VALUE",
            "REMOVED SENSITIVE VALUE"
        ],
        "datadirectory": "REMOVED SENSITIVE VALUE",
        "skeletondirectory": "/data/default_data",
        "htaccess.RewriteBase": "/",
        "dbtype": "mysql",
        "version": "19.0.4.2",
        "dbname": "REMOVED SENSITIVE VALUE",
        "dbhost": "REMOVED SENSITIVE VALUE",
        "dbport": "",
        "dbtableprefix": "oc_",
        "dbuser": "REMOVED SENSITIVE VALUE",
        "dbpassword": "REMOVED SENSITIVE VALUE",
        "installed": true,
        "mail_smtpmode": "smtp",
        "mail_smtpauthtype": "LOGIN",
        "mail_smtpsecure": "ssl",
        "mail_smtpauth": 1,
        "mail_from_address": "REMOVED SENSITIVE VALUE",
        "mail_domain": "REMOVED SENSITIVE VALUE",
        "mail_smtphost": "REMOVED SENSITIVE VALUE",
        "mail_smtpport": "465",
        "mail_smtpname": "REMOVED SENSITIVE VALUE",
        "mail_smtppassword": "REMOVED SENSITIVE VALUE",
        "memcache.local": "\\OC\\Memcache\\APCu",
        "memcache.locking": "\\OC\\Memcache\\Redis",
        "redis": {
            "host": "REMOVED SENSITIVE VALUE",
            "port": 0,
            "dbindex": 0,
            "timeout": 1.5
        },
        "maintenance": false,
        "theme": "",
        "loglevel": 2,
        "updater.release.channel": "stable",
        "overwrite.cli.url": "REMOVED SENSITIVE VALUE",
        "mysql.utf8mb4": true
    }
}

Are you using external storage, if yes which one: local/smb/sftp/...
no

Are you using encryption: yes/no
no

Are you using an external user-backend, if yes which one: LDAP/ActiveDirectory/Webdav/...
no

Client configuration

Browser:
N/A
Operating system:
Win 10

Logs

2020-11-03 14:23:31:820 [ warning nextcloud.gui.activity ]: Item “Bibliography/VacuumLeaks/xxxx” retrieved resulted in “File name too long”

@florian-prd florian-prd added 0. Needs triage Pending check for reproducibility or if it fits our roadmap bug labels Nov 19, 2020
@Blackclaws

I can confirm this is an issue. I have the problem appear when using encfs to encrypt a folder that is then synced via nextcloud. The issue crops up when the sync client tries to upload the file. It simply triggers an internal server error.

@compixonline

I'm having this issue too, mostly caused by tags appended to filenames by the utility "Tagspaces", which aids file organisation via the transferable and archive-safe mechanism of adding tags in square brackets to the file name. I can't work out what the cut-off for "too long" is, but I don't think it's even 150 characters. I'm using the Debian Linux client and am unsure of the server, but I'm looking into it. Slightly frustrating, as the Nextcloud/Tagspaces combo makes for a long-term viable lo-fi personal database.

@XutaxKamay

XutaxKamay commented Apr 3, 2021

I'm having the same issue, and sadly my experience with Nextcloud is not great here, but from what I saw, the problem comes from how the file transfer is currently done. Normally, on a GNU/Linux system, a filename shouldn't be more than 255 characters (NAME_MAX) and a path no more than 4096 (PATH_MAX).

Now that I look at it (yes, I'm just a dirty anime/vn watcher):

Error: file_put_contents(/usb0/nextcloud_data/xutaxkamay/files/Pictures/Grabber/erm/konachan.com___03-26-2019 13.15___280989___chihuri405(original+pixiv fantasia)___unknown___original pixiv fantasia chihuri405 blood dark dress elbow gloves forest garter belt gloves grass gray hair horns long hair red eyes stockings thighhighs t.jpg.ocTransferId1984429488.part): failed to open stream: File name too long at /var/www/nextcloud/lib/private/Files/Storage/Local.php#278

The path doesn't exceed 4096 characters, so it's all good on that front, but there's a problem.

The name konachan.com___03-26-2019 13.15___280989___chihuri405(original+pixiv fantasia)___unknown___original pixiv fantasia chihuri405 blood dark dress elbow gloves forest garter belt gloves grass gray hair horns long hair red eyes stockings thighhighs t.jpg.ocTransferId1984429488.part is 277 characters because of the junk added later; without it, it's about 249-250 characters.

My idea to counter this would be to use a hash (it could be a fast one like CRC32) of the path/filename together with the transferId/parts in a temp directory, so we can later reconstruct the file under its proper path and filename, while the file actually stored on the device never exceeds the limits.

For example:

crc32("/usb0/nextcloud_data/xutaxkamay/files/Pictures/Grabber/erm/konachan.com___03-26-2019 13.15___280989___chihuri405(original+pixiv fantasia)___unknown___original pixiv fantasia chihuri405 blood dark dress elbow gloves forest garter belt gloves grass gray hair horns long hair red eyes stockings thighhighs t.jpg" + salt) = B299DF4D

/tmp/nextcloud/B299DF4D.ocTransferId1984429488.part1 /tmp/nextcloud/B299DF4D.ocTransferId1984429488.part2 ...

Then retrieve the filename and path by comparing hashes (this could be a hashmap).

I think CRC32 is good because it's fast, and 2^32 possible values should be more than enough for this purpose.

I would be glad if any developer on this project could do it, but if not, I guess I could give it a try.

Thanks.

EDIT:
The temp directory would be somewhere on the same partition to improve performance (since we're moving files), so it could be a hidden directory under data/ (in case the data directory is on another partition).

EDIT 2:

Actually, there's still another problem when a path exceeds 4096 characters once the transferId/part suffix is added, which is essentially the same problem as above: a long path on the device syncing to the server would exceed the limit after those are appended.

A third problem: if the device has a shorter root path for the synced folders than Nextcloud's data directory, files won't be written when their subpaths are close to 4096 characters.

Maybe a virtual file system would help (the most preferable option in my opinion, since you wouldn't need to care how far things exceed the limits, though it's kind of reinventing the wheel unless we use something that already exists; it could be as simple as allocating a file and using it as a device). Using only hashes isn't a terrible idea either, but if you wanted to back up those files directly on the server, you wouldn't know what they are without looking at the database, and there can be (rare) collisions. The more I advance, the more problems I see.
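The hashed part-file naming proposed above can be sketched roughly as follows. This is only an illustration of the scheme, not Nextcloud code: the `make_part_name` helper, the salt, and the limit constant are all assumptions.

```python
# Hypothetical sketch of the CRC32 part-file naming scheme proposed above.
# Nothing here is Nextcloud API; helper name, salt, and limits are
# assumptions for illustration only.
import zlib

NAME_MAX = 255  # typical per-component limit on ext4 and most Linux filesystems


def make_part_name(target_path: str, transfer_id: int, part: int, salt: str = "s") -> str:
    """Derive a short, fixed-length temp name from an arbitrarily long target path."""
    digest = zlib.crc32((target_path + salt).encode("utf-8")) & 0xFFFFFFFF
    return f"{digest:08X}.ocTransferId{transfer_id}.part{part}"


# The server would keep a map from digest back to the real path so the
# final file can later be reconstructed under its proper (long) name.
long_path = "/usb0/nextcloud_data/files/" + "x" * 300 + ".jpg"
temp_name = make_part_name(long_path, 1984429488, 1)
assert len(temp_name) <= NAME_MAX  # short no matter how long the target is
```

As the comment itself notes, CRC32 collisions are possible, so the digest-to-path map would have to store the full path and disambiguate on collision; a longer hash (e.g. a truncated SHA-256) would make collisions negligible.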

@ppochon

ppochon commented Apr 7, 2021

Same issue here. Version 3.1.3 (Ubuntu).

@bubonic

bubonic commented Apr 24, 2021

I can confirm this is an issue. I have the problem appear when using encfs to encrypt a folder that is then synced via nextcloud. The issue crops up when the sync client tries to upload the file. It simply triggers an internal server error.

I can also confirm the issue, but it's not the fault of Nextcloud. I'm on an ext4 filesystem, which has a maximum of 255 characters for a filename. However, my home directory uses ecryptfs, and according to Mike Mabey's PhD notes, the way ecryptfs encrypts filenames only allows a maximum of 143 (ASCII) characters for the unencrypted filename, and less if you have Unicode characters.

So while my Nextcloud server uses ext4's 255-character limit, any file with a name over 143 characters stored on the server and synced to my home directory will fail. It's rather annoying to see the big red icon with the white 'X'. Luckily, there were only a dozen or so filenames, so I just manually edited them on the server.

Note: LUKS is a better encryption method imho, but the advantage of ecryptfs is that no other user can view the contents of your home directory without your key, whereas LUKS is decrypted at boot or mount.
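For anyone hitting the same ecryptfs ceiling, a quick way to find offending files before syncing is to scan for names over the limit. A minimal sketch, assuming the ~143-character figure from the comment above:

```python
# Sketch: list files/dirs whose *name* (final path component) exceeds a
# limit such as ecryptfs's ~143-char ceiling for plaintext names.
# The 143 figure is taken from the comment above, not verified here.
import os


def find_long_names(root: str, limit: int = 143):
    """Yield paths whose final component is longer than `limit` characters."""
    for dirpath, dirnames, filenames in os.walk(root):
        for name in dirnames + filenames:
            if len(name) > limit:
                yield os.path.join(dirpath, name)
```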

@MatthieuPERIN

Hello, same problem here, plus a nightmare of limit issues because we have Windows, Linux & macOS desktop users (thus potentially three different filename/path length limits).

Alternate proposal:
If this limitation cannot be handled correctly by Nextcloud, at least please:

  • mark the problematic files as "non-synchronized" with a specific mark or an error badge (same for folders) in the desktop application, so that users know the issue plainly and clearly;
  • add information on the server side, using a tag, stating which users cannot synchronize the file and why (this might be a good addition to each file's history?).

@nicojx

nicojx commented Jul 5, 2021

The problem with suggested solutions is that they assume actual user-friendly files are uploaded. For me, the problem appeared with files encrypted with gocryptfs, which also encrypts (and substantially lengthens) filenames. I cannot shorten these filenames, but would expect NC to successfully sync them.

@szaimen
Contributor

szaimen commented Aug 8, 2021

This looks to me like an issue with the filesystems you are using. I don't think this is fixable by Nextcloud, as we would need to provide workarounds for every filesystem out there that has a filename/path length limitation.

cc @nextcloud/server-triage on this if fixing this is feasible.

@nicojx

nicojx commented Aug 8, 2021

For me the issue appeared on Linux, which doesn't have the Windows path length limit, so I don't think the issue is with the file system I was using. Dropbox dealt with the same file path on the same OS fine.

@Blackclaws

For me the issue appeared on Linux, which doesn't have the Windows path length limit, so I don't think the issue is with the file system I was using. Dropbox dealt with the same file path on the same OS fine.

This is unfortunately only semi-accurate. Depending on the filesystem used on Linux, there is indeed a path limit, which also means there is an intrinsic path limit on the server side, at least when not using object storage. The question is whether the server-side path limit couldn't be circumvented somehow.


@ghost ghost added the stale Ticket or PR with no recent activity label Sep 10, 2021
@MatthieuPERIN

This is still a valid problem for me!

@ghost ghost removed the stale Ticket or PR with no recent activity label Sep 10, 2021

@ghost ghost added the stale Ticket or PR with no recent activity label Oct 10, 2021

@ghost ghost removed the stale Ticket or PR with no recent activity label Oct 12, 2021
@klipitkas

This is still a valid issue.

@bubonic

bubonic commented Jan 21, 2022

I suggest automatically truncating filename + extension to a maximum length of 143 whenever the filename is longer than 143 ASCII characters. Worry about Unicode if it comes up.

This would solve these issues.

@MatthieuPERIN

The suggestion from @bubonic is a very good one. I would recommend truncating names to 140 characters plus a random/increasing digit to ensure no duplicate names, and maybe creating a .txt file (truncated_namefile.txt?) that lists the initial names vs. the truncated ones, to help users follow the actions taken.
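The truncate-plus-counter idea above can be sketched as follows; the 140-character budget and the helper name are assumptions for illustration, not an existing Nextcloud feature.

```python
# Hypothetical truncation helper: shorten a name to a budget while keeping
# the extension, appending an increasing digit to avoid duplicate names.
import os


def truncate_name(name: str, taken: set, limit: int = 140) -> str:
    """Return `name` unchanged if it fits, else a truncated unique variant."""
    if len(name) <= limit and name not in taken:
        return name
    stem, ext = os.path.splitext(name)
    counter = 0
    while True:
        suffix = str(counter) if counter else ""
        candidate = stem[: limit - len(ext) - len(suffix)] + suffix + ext
        if candidate not in taken:
            return candidate
        counter += 1


# A sync tool could then log each (original, truncated) pair to a text
# file, as suggested above, so users can trace what was renamed.
```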

@KarelWintersky

It is a valid issue. Same problem here; server on Debian 11.

@mattdale77

I have experienced this issue. In 2022 this really shouldn't be a problem. Do we know if this affects particular versions? I'm on 22, which I know I'm going to have to upgrade, but other than this I haven't experienced issues.

@moussaCamara

I have the same problem. The issue is still valid.

Nextcloud could provide an option to automatically truncate filenames longer than a given number of characters. This would allow one to set a limit of 143 or whatever size...

@ModischFabrications

Stumbled over the same problem and made a quick script to truncate all filenames longer than 180 chars. It's obviously not a durable solution, but it's a cheap bandaid until someone fixes Nextcloud.

Use at your own risk:
truncate_filenames.sh.txt

@szaimen
Contributor

szaimen commented Jan 23, 2023

Hi, please update to 24.0.9 or better 25.0.3 and report back if it fixes the issue. Thank you!

My goal is to add a label like e.g. 25-feedback to this ticket once the bug can be reproduced on an up-to-date major Nextcloud version. However, this is not going to work without your help. So thanks for all your effort!

If you don't manage to reproduce the issue in time and the issue gets closed but you can reproduce the issue afterwards, feel free to create a new bug report with up-to-date information by following this link: https://github.com/nextcloud/server/issues/new?assignees=&labels=bug%2C0.+Needs+triage&template=BUG_REPORT.yml&title=%5BBug%5D%3A+

@Blackclaws

Seems fixed. I can successfully create a long string of nested directories whose total length vastly exceeds 250 characters, at least on Linux. Haven't tested it on Windows yet. Syncs fine.

@szaimen
Contributor

szaimen commented Feb 5, 2023

Thanks for verifying!

@szaimen szaimen closed this as completed Feb 5, 2023
@florian-prd
Author

Many thanks for solving it, it works!

@kintaro1981

Seems fixed. I can successfully create a long string of nested directories whose total length vastly exceeds 250 characters, at least on Linux. Haven't tested it on Windows yet. Syncs fine.

On Windows I'm getting the error with filenames of 255 characters (it works with 200 characters). What's the limit?

@th-joerger

This is still a problem, because during transfer a transferId is added.

This error message occurs with the file. I changed the file name to remove sensitive content, but the length and characteristics of all path elements are preserved:

file_put_contents(/var/www/html/data/UserNameLong/files/SomeCompany3/SomeFolder/Another1/UnnecessaryLongFolder12/Folder with Space Numbers.Dots/Folder with Space Numbers-Hyphens/An almost comically long file name with a description, commas, round brackets (at least one), winding description, very many words almost, completely maxing out windows file name limits with 245 chars and also ending in a date like 20241007.jpeg.ocTransferId2022520982.part): Failed to open stream: File name too long at /var/www/html/lib/private/Files/Storage/Local.php#304

The original filename is valid under Windows at 245 chars. During upload, a suffix is added, which makes the name fail the filename length constraints.
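The arithmetic of this failure mode is easy to check: the .ocTransferId<id>.part suffix seen in the log adds roughly 28 characters, so a name that is valid under a 255-character component limit but longer than about 227 characters overflows once the suffix is appended. A small sketch, with illustrative limits and an illustrative example name:

```python
# Check whether a filename still fits the per-component limit once the
# upload suffix from the error log above is appended. Illustrative only.
LINUX_NAME_MAX = 255  # per-component byte limit on ext4 and similar filesystems


def part_name_fits(filename: str, transfer_id: int) -> bool:
    """True if filename + transfer suffix stays within the component limit."""
    part_name = f"{filename}.ocTransferId{transfer_id}.part"
    return len(part_name.encode("utf-8")) <= LINUX_NAME_MAX


original = "x" * 240 + ".jpeg"               # 245 chars: valid on its own
print(part_name_fits(original, 2022520982))  # → False: suffix pushes it to 273
```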



@solracsf solracsf reopened this Nov 20, 2024
@solracsf
Member

solracsf commented Nov 20, 2024

The original filename is valid under Windows with 245 chars. During upload, a suffix is added which fails the filename length constraints.

Can you try to set 'part_file_in_storage' => false, in config.php and retry your upload?
This will not fix the underlying problem, but can help on some edge cases.

Please note, the limit is 250 chars in Nextcloud (and the same limit applies to names stored in the oc_filecache table).

@joshtrichards joshtrichards added feature: dav hotspot: filename handling Filenames - invalid, portable, blacklisting, etc. labels Nov 22, 2024
@th-joerger

I could not replicate the issue with NC Server (30.0.2) and NC Desktop Client (3.14.3). I had since renamed the files in question. Renaming them back does not trigger the error.
