How do I get a list of the URLs without having to manually click on them? I don't have time to create a list of 100s of URLs manually.
Sure, but for what? My request to download subfolders, the ability to list the current subdirectory names, or your glob'ing idea?
-
for subfolders, just run clone against that subfolder's GUID? An issue for `ls` to expand its output to include GUIDs (could include names too), and a new issue for glob'ing
-
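The glob'ing idea above could be approximated today by scripting around the osfclient CLI: list everything once with `osf ls`, filter the paths with a shell-style glob, and fetch only the matches. A minimal sketch; `osf ls` and `osf fetch` are real osfclient commands, but the project GUID and pattern below are placeholders, not a real project:

```python
# Sketch: glob-filter an OSF project's file listing before downloading.
# Requires `pip install osfclient`; the GUID/pattern are hypothetical.
import fnmatch
import subprocess

def match_paths(paths, pattern):
    """Return only the remote paths that match a shell-style glob."""
    return [p for p in paths if fnmatch.fnmatch(p, pattern)]

def fetch_matching(project, pattern):
    """List the whole project once, then fetch only the matching files."""
    listing = subprocess.run(
        ["osf", "-p", project, "ls"],
        capture_output=True, text=True, check=True,
    ).stdout.splitlines()
    for path in match_paths(listing, pattern):
        subprocess.run(["osf", "-p", project, "fetch", path], check=True)

# e.g. fetch_matching("abc12", "osfstorage/data/*.csv")
```

This avoids cloning the full project: only the single `ls` call touches every file's metadata, and downloads are limited to the glob's matches.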
I tried to clone a subfolder but got the whole OSF project. Can you link me to the right way to do it, please?
-
p.s. thank you for continuing to engage! These are great use cases that we might not be finding out about so quickly if not for you!
-
I ideally want to put my data on OSF and let my code decide what to download and how. It's really big data, so cloning the whole OSF project is a big ask for my readers/users. I will obviously give them the option of downloading the full 5 GB... but it would be nice to let them grab fewer big files, which would still allow some data fun!


And you are welcome, thank you all for making this and listening!
-
Hm, you could organize your OSF project with components and use osfclient, or script against the API to pull specific folders from one project without components.
-
it is unclear to me exactly which things will work in this outline - let's figure that out ;)
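The "script against the API for specific folders" route could look like this with osfclient's Python bindings. A minimal sketch, assuming `osfclient` is installed; the project GUID and folder prefix are placeholder examples, not real identifiers:

```python
# Sketch: walk one OSF project's file metadata and download only the
# files under a chosen folder prefix, instead of cloning everything.
import os

def in_folder(remote_path, prefix):
    """True if a remote path (e.g. '/data/big/x.bin') sits under prefix."""
    return remote_path.lstrip("/").startswith(prefix.rstrip("/") + "/")

def download_folder(project_guid, prefix, dest="downloads"):
    # Imported lazily so the path helper above has no osfclient dependency.
    from osfclient import OSF

    project = OSF().project(project_guid)     # anonymous, read-only access
    storage = project.storage("osfstorage")   # the default storage provider
    for f in storage.files:                   # iterates the file metadata
        if not in_folder(f.path, prefix):
            continue
        local = os.path.join(dest, f.path.lstrip("/"))
        os.makedirs(os.path.dirname(local), exist_ok=True)
        with open(local, "wb") as fp:
            f.write_to(fp)                    # streams the file's contents

# e.g. download_folder("abc12", "data/small")
```

This matches the use case above: readers run the script with a folder prefix and get only that slice of the data, while the full-project clone stays available for anyone who wants all 5 GB.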