YALB-1550: Single Content Sync #409
Conversation
Created multidev environment pr-409 for yalesites-platform.

Force-pushed c74ee8f to 20632c5
I have 2 worries:

There's work being done for drush import/export in this issue. Do we need our own Drush commands?

Oh yeah, you're right! They're controlling the entity type, so that was the only reason I thought we'd need to keep mine, but they've already thought of it. I'll take it out.
Most of this worked great, but a few issues:

- The Manage All Taxonomy view was still there after I disabled the Starterkit config split, I think because that config is in /config/sync and not /config/starterkit_config.
- I was able to import smaller exports with no problem (taxonomy terms), but even just a few nodes zipped into a 126MB archive caused both my local and Pantheon to reject the file. I got an Ajax error in the console:
Local:

```
An AJAX HTTP error occurred.
HTTP Result Code: 413
Debugging information follows.
Path: /admin/content/import?element_parents=upload_fid&ajax_form=1
StatusText: error
ResponseText:
413 Request Entity Too Large
nginx
```

Pantheon:

```
An AJAX HTTP request terminated abnormally.
Debugging information follows.
Path: /admin/content/import?element_parents=upload_fid&ajax_form=1
StatusText: error
ReadyState: 0
```
@codechefmarc For the first point, I am so glad you said that. My biggest worry was hanging the permissions of the "Manage All Taxonomy" view on another permission set in order to hide it, but I can just include it in the config split! I never even thought of the fact that it has a YAML config! DUH! As for the other issue, I haven't hit it, but I did most of my testing locally, and I bet I used small sets of files on the multidevs. Thank you so much for this info. I'll get to work on it.
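For illustration, moving the view into the split could look like adding its config name to the split's list. This is a sketch assuming the config_split 2.x schema; the split id, folder, and the view's machine name are guesses here, not the repo's actual values:

```yaml
# config/sync/config_split.config_split.starterkit_config.yml (illustrative;
# id, folder, and the view machine name are assumptions)
id: starterkit_config
label: 'Starterkit config'
status: true
folder: ../config/starterkit_config
# Config listed here is moved out of the default sync directory entirely,
# so disabling the split removes the view.
complete_list:
  - views.view.manage_all_taxonomy
```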
Force-pushed eef3ede to eae3edc
It looks as if we have limits on file-size uploads in both nginx and PHP. Locally, using lando, nginx looks to be limiting to 80M while PHP limits to 100M. I'm confused, because that would mean my uploads must have been under 80M, and I could have sworn I went higher. In any event, the uploads are indeed being limited by nginx, resulting in a 413: Request Entity Too Large. I'm looking into whether we can fix this with an .htaccess modification or whether we have to look at overriding php.ini/nginx.conf.
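For reference, the two knobs involved are nginx's `client_max_body_size` and PHP's upload limits. A sketch of raising them follows; the 200M values are illustrative, and on Pantheon these settings are platform-managed and may not be overridable:

```
# nginx.conf (http, server, or location context): allow larger request bodies
client_max_body_size 200M;
```

```ini
; php.ini: both must be raised, with post_max_size >= upload_max_filesize
upload_max_filesize = 200M
post_max_size = 200M
```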
If this is the file-size limit for uploads, perhaps we can get around it by hosting the zips on S3 or in another repo and importing from there instead of a direct upload? That may take some custom code, but it would be fun to figure out. (Well, fun for some!)
@vinmassaro contacted me and wanted me to check out the status of the drush commands they're working on. If they're good enough, we might be able to get around this for now using drush, and I bet that's why they implemented the S3 feature. It looks like their drush command works similarly to mine: it exports to a directory and we have to go get it. The import file needs to be accessible from the web root to import. I'm still going to look into it more. I've not dealt with S3, so it'd be neat to try if this doesn't get us there.
Great news! I got drush debugging working and noticed that it was erroring on the local files directory (permissions, maybe?). In any event, I wiped completely, did a fresh pull with the patch, and it worked! I'm now rebuilding both a yalesites dev and a dev version of yalesites.yale.edu so I can get real content from them and test it one more time. I think it's working! I'll provide instructions in the PR above once I'm certain.
The drush command worked beautifully for importing! I tested locally only for now (I didn't want to mess up the multidev), but I also have a question: I had to move the zip file into the web folder for it to work (I tried a relative path outside of web, but it didn't find it). So, could this take an external URL so we could store these zip files on S3 or in a GitHub repo? I assume you looked into that.
Yeah, that is a limitation currently: when this command was written, the S3 implementation hadn't been done yet, and the branch is behind that commit. My guess is that support will be added, but for now it only handles zip files within the web directory. There is a "getRealDirectory" method that returns the path, but it doesn't currently account for the S3 situation.
And now that I'm looking at the implementation, it's not that the zip file can reside in S3; it's that image assets residing in S3 that are referenced by a node are imported properly (no save is attempted). So we might still need some work to support pointing at a URL to download a zip.
Placing this here in case we ever need it: there was talk of a sync-via-HTTP drush command, https://github.com/drush-ops/drush/blob/master/examples/Commands/SyncViaHttpCommands.php#L60
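Until something like that exists, the "fetch then import" flow discussed above can be sketched as a small shell step: download the hosted zip into the web root so the import command can see it. Everything here is a placeholder (the web root path, the URL, the zip name); a local `file://` URL stands in for S3 or a GitHub release, and the drush step is shown as a comment because it needs a full Drupal site:

```shell
# Sketch of a fetch-then-import workaround; paths and URL are illustrative.
WEBROOT=./web                        # stand-in for /app/web on lando
mkdir -p "$WEBROOT"

# Simulate a remotely hosted export archive with a local file:// URL.
printf 'demo' > exports.zip
ZIP_URL="file://$PWD/exports.zip"    # would be an S3 / GitHub URL in practice

# Download the archive into the web root, where the importer can reach it.
curl -fsSL -o "$WEBROOT/exports.zip" "$ZIP_URL"
echo "downloaded $(wc -c < "$WEBROOT/exports.zip") bytes"

# Then, from the site:
# lando drush content:import ./exports.zip
```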
Force-pushed 51740d6 to 6880474
@vinmassaro The changes we talked about are in place. When you can, give it another try, and thanks for talking this through with me yesterday.
Looks great. I was able to export starterkit content and import it into a clean local install with no issue. Approved!
This update allows exporting and importing of nodes, media, and taxonomy through either a graphical interface or a drush command. Keep in mind that due to limitations on file-size uploads with PHP, Pantheon, and Drupal, importing large amounts of data should be done via Drush.

* Add single_content_sync module and enable it
* Add custom bulk actions for exporting taxonomy and media in the same way as nodes
* Add processors for types not handled by single_content_sync:
  * Embed
  * Markup
  * SmartDate
  * ViewsBasicParams
* Add custom admin views for exporting nodes, taxonomy, and media, accessible through Configuration->System->Export Nodes
* Add ZipArchive patch: the open method has been deprecated, and a message saying so appears when attempting to export to a zip file. This patch squelches it. The deprecation has been fixed in Drupal 10.1 but is still present in 9.5.
Force-pushed 00f6e3d to 7c6adff
YALB-1550: Single Content Sync

Description of work

- Adds config under /config/system to assist in graphical exports

Functional testing steps:

Via drush

1. Run `lando drush content:export nodes ./exports --translate --assets` from /app/web/
2. Run `lando drush content:import ./exports/<name_of_the_zip_file_that_was_made>`
3. Verify you see `[notice] Message: Successfully imported the content.` and receive no errors
4. Repeat with `lando drush content:export media ./export --assets --translate`
5. Repeat with `lando drush content:export taxonomy_term ./export --assets --translate`

The reason the `--all-content` flag is not used is that it includes block_content as well. If we'd like that, `--all-content` could be passed to the first node command and it would return them all. Technically, since the UUIDs would be the same, it should be OK, but I'd rather be explicit about what we're targeting.

Via site

- Configuration->System->Export Node: `Export content` is in the Action drop-down at the bottom of the listing
- Configuration->System->Export Node->Export Media: `Export media` is in the Action drop-down at the bottom of the listing
- Configuration->System->Export Node->Export Taxonomy: `Export taxonomy` is in the Action drop-down at the bottom of the listing
- Verify the relevant `Export X` action is selected for the bulk action and click the `Apply to selected items` button

Importing

Due to the size of the content zips, importing should be done via Drush.
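The choice between the admin UI and Drush can be checked locally before attempting an upload. A minimal sketch, assuming the 80M nginx limit observed earlier (the limit value and zip name are illustrative; a 1MB dummy file stands in for a real export):

```shell
# Decide whether an export zip fits the web upload limit before trying the
# admin UI; anything at or over the limit should go through drush instead.
LIMIT_MB=80            # assumed to match the nginx client_max_body_size
ZIP=export.zip         # hypothetical export archive name

# Simulate a small export for demonstration (1MB of zeros).
head -c 1048576 /dev/zero > "$ZIP"

size_mb=$(( $(wc -c < "$ZIP") / 1024 / 1024 ))
if [ "$size_mb" -ge "$LIMIT_MB" ]; then
  echo "too large for the admin UI; use: drush content:import ./$ZIP"
else
  echo "ok for admin UI upload (${size_mb}MB)"
fi
rm -f "$ZIP"
```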