Sure, some people might argue that there are specialized tools for each of these functions. And that’s true. But the tradeoff is that you'd need to manage a lot more with individual services. With Nextcloud, you get a unified platform that might be good enough to run a company, even if it’s not very fast and some features might have bugs.
The AIO has addressed issues like update management and reliability; it's been very good in my experience. You get a fully tested, ready-to-go package from Nextcloud.
That said, I wonder: if the platform were rewritten in a more performance-efficient language than PHP, with a simplified codebase and a trimmed-down feature set, would it run faster? The UI could also be more polished; the Synology DSM web interface, for example, looks really nice.
That said, there's an ownCloud variant called Infinite Scale which is written in Go.[1] Honestly, I tried to go that route, but its requirements are pretty opinionated (Ubuntu LTS 22.04 or 24.04 and lots of Docker containers littering your system). It does look like it's getting a lot of development, though.
Hm?
> This guide describes an installation of Infinite Scale based on Ubuntu LTS and docker compose. The underlying hardware of the server can be anything as listed below as long it meets the OS requirements defined in the Software Stack
https://doc.owncloud.com/ocis/next/depl-examples/ubuntu-comp...
The Software Stack section goes on to say it just needs Docker, Docker Compose, shell access, and sudo.
Ubuntu and sudo are probably only mentioned because the guide walks you through installing docker and docker compose.
And I wish it was "containerized" but really it's "dockerized" as this thread demonstrates: https://central.owncloud.org/t/owncloud-docker-image-with-ro...
So yeah like I said in my original comment, for personal use it's just not right for me (because I choose not to use docker in my personal projects), but I hope it's right for other people because it looks like a killer app.
I'd definitely like to see what other options are available on other distros so I'll dig through their documentation more.
And yeah, trying to use podman with something that's based on docker compose is ... probably gonna give you some headaches, I'd guess. I don't particularly know the pitfalls but if you're expecting it to be transparently swappable, I don't think that's an owncloud issue.
I haven't bothered to fix it.
You really consider 1 MB of JS too heavy for an application with hundreds of features? How exactly are developers supposed to fit an entire web app into that? Why does this minimalism suddenly apply only to JavaScript? Should every desktop app be under 1 MB too? Is Windows Calculator's 30 MB binary also an offense to your principles?
What year is it, 2002? Even low-band 5G gives you 30–250 Mbps down. At those speeds, 20 MB of JS downloads in well under a second. So what's the math behind the 5–10 second figure? And what about the cache? Is it turned off for you, so you redownload the whole of Nextcloud from scratch every time?
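For what it's worth, the raw transfer arithmetic is easy to check (a quick sketch using the 20 MB and 30–250 Mbps figures from this thread; it ignores latency, caching, and parse time):

```python
def download_seconds(size_mb: float, speed_mbps: float) -> float:
    """Time to transfer size_mb megabytes at speed_mbps megabits per second."""
    return size_mb * 8 / speed_mbps

# At the low end of low-band 5G (30 Mbps), 20 MB takes ~5.3 s;
# at the high end (250 Mbps), ~0.64 s.
for speed in (30, 250):
    print(f"{speed} Mbps: {download_seconds(20, speed):.2f} s")
```

Note that the low end of that range actually lands inside the disputed 5–10 second window, before latency is even counted.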
Nextcloud is undeniably slow, but the real reasons show up in the profiler, not the network tab.
On paper. In practice, it can be worse than that.
I've spent the past year using a network called O2 here in the UK. Their 5G SA coverage depends a lot on low band (n28/700MHz) and had issues in places where you'd expect it to work well (London, for example). I've experienced sub 1Mbps speeds and even data failing outdoors more than once. I have a good phone, I'm in a city, and using what until a recent merger was the largest network in the country.
I know it's not like this everywhere or all the time, but for those working on sites, apps, etc, please don't assume good speeds are available.
I sometimes see 1 Gbps+ with 100 MHz of n78 (3500 MHz), a frequency that wasn't used for any of the previous Gs. But as you're aware, 5G can also be deployed on low band, and while it's more efficient there, it can't do miracles. For example, networks here use 700 MHz. A 10 MHz slice of 700 MHz seems to provide around 75 Mbps on 4G and around 80 Mbps on 5G under good conditions. Better, but not a huge improvement.
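Those figures line up with simple spectral-efficiency arithmetic (a rough sketch; the efficiency values are just back-calculated from the speeds above, not official numbers):

```python
def peak_mbps(bandwidth_mhz: float, efficiency_bps_per_hz: float) -> float:
    """Approximate peak throughput: channel bandwidth times spectral efficiency."""
    return bandwidth_mhz * efficiency_bps_per_hz

print(peak_mbps(10, 7.5))    # 10 MHz at ~7.5 bit/s/Hz: the ~75 Mbps 4G figure
print(peak_mbps(10, 8.0))    # 10 MHz at ~8.0 bit/s/Hz: the ~80 Mbps 5G figure
print(peak_mbps(100, 10.0))  # 100 MHz of n78 at ~10 bit/s/Hz: ~1 Gbps
```

The point being: most of the 5G headline numbers come from the extra bandwidth, not from the efficiency gain.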
The problem in my case is a lack of capacity. Not all sites have been upgraded to have faster backhaul or to broadcast the higher, faster frequencies they use for 5G, so I may end up using low band from a site further away... Low frequencies = less capacity to carry data. Have too many users using something with limited capacity and sometimes it will be too slow or not work at all. It's usually the network's fault as they're not upgrading/expanding/investing enough or fast enough... sometimes it's the local authority being difficult and blocking upgrades/new sites (and we also have the "5G = deadly waves" crowd here).
It shouldn't happen, but it does happen[0], and that's why we shouldn't assume that a user - even in a developed country - will have signal or good speeds everywhere. Every network has weak spots, coverage inside buildings depends a lot on the materials used, large events can cause networks to slow down, etc. Other than trying to pick a better network, there's not much a user can do.
The less data we use to do something, the better it is for users.
---
[0] Here's a 2022 article from BBC's technology editor complaining about her speeds: https://www.bbc.co.uk/news/technology-63798292
First and foremost, I agree with the meat of your comment.
But I wanted to point out, regarding your comment, that it DOES very much matter that apps meant to be transmitted over a remote connection are, indeed, as slim as possible.
You must be thinking about 5G on a city with good infrastructure, right?
I'm right now having a coffee on a road trip, with a 4G connection, and just loading this HN page took like 8~10 seconds. Imagine a bulky and bloated web app if I needed to quickly check a copy of my ID stored in NextCloud.
It's time we normalize testing network-bound apps through low-bandwidth, high-latency network simulators.
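Tools like Chrome DevTools throttling or Linux's tc netem do this at the network level, but even a crude in-process throttle catches a lot in tests (a minimal sketch; the rate and latency values are arbitrary placeholders):

```python
import io
import time

def throttled_copy(src, dst, rate_bps: int, latency_s: float = 0.0, chunk: int = 4096) -> int:
    """Copy src to dst, sleeping so throughput never exceeds rate_bps bytes/sec."""
    if latency_s:
        time.sleep(latency_s)  # simulate one-time connection latency
    start = time.monotonic()
    sent = 0
    while True:
        buf = src.read(chunk)
        if not buf:
            break
        dst.write(buf)
        sent += len(buf)
        # sleep until wall-clock time catches up with the byte budget
        delay = sent / rate_bps - (time.monotonic() - start)
        if delay > 0:
            time.sleep(delay)
    return sent

# Example: copying 64 KB at 64 KB/s should take about a second
n = throttled_copy(io.BytesIO(b"x" * 65536), io.BytesIO(), rate_bps=65536)
```

Wrap your test fixtures' response bodies in something like this and the "works fine on office wifi" class of bugs shows up immediately.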
Pretty much the same with JavaScript: modern engines are amazingly fast, or at least their speed doesn't really depend on the raw amount of JavaScript fed to them.
Yes, I don't know, because it runs in the browser, yes, yes.
Unlike many other projects, it's surprisingly easy to get into a situation where the DB is throttled by IO issues on a single-box machine. Putting the DB on a separate drive from the storage and logging really speeds things up.
That, and properly setting up a lot of the background tasks like image preview generation, Redis, etc.
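For reference, most of that tuning lives in Nextcloud's config/config.php (a sketch of the commonly documented caching setup; the Redis host is a placeholder for your own):

```php
// config/config.php (excerpt) -- enable memory caching and Redis file locking
'memcache.local'   => '\OC\Memcache\APCu',
'memcache.locking' => '\OC\Memcache\Redis',
'redis' => [
  'host' => 'localhost', // placeholder: your Redis host or socket path
  'port' => 6379,
],
```

Switching background jobs from AJAX to a real cron job also helps keep previews and other queued work from piling up, e.g. `*/5 * * * * php -f /var/www/nextcloud/cron.php` (path is a placeholder for your install).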
Nextcloud's client support is very good though, and it has some great apps. I use PhoneTrack on road trips a lot.
If every aspect of Nextcloud were as clean, quick, and lightweight as PhoneTrack, this world would be a different place. The interface is a little confusing, but once I got the hang of it, it's been awesome, and there's just nothing like it. I keep an old phone with PhoneTrack in my murse, and that way if I leave it on the bus (again) I actually have a chance of finding it.
No $35/month subscription, and I'm not sharing my location data with some data aggregator (aside from Android of course).
I'm extremely tempted to write a lightweight alternative. I'm thinking sourcehut [1] vs GitHub.
It works very well, has a polished UI, and uses very few resources. It also does a lot less than Nextcloud.
Nextcloud is an old product that inherits from ownCloud and has been developed in PHP since 2010. It has extensibility at its core, with thousands of extensions available.
So yaaay compare it with source hut ...
> So yaaay compare it with source hut ...
I'm not saying that sourcehut is the same in any way, but I want the difference between GitHub and sourcehut to be the difference between NextCloud and alternative.
> Nextcloud is an old product that inherit from Owncloud developed in php since 2010.
Tough situation to be in, I don't envy it.
> It has extensibility at its core through the thousands of extensions available.
Sure, but I think for some limited use cases, something better could be imagined.
In version 31, the frontend has been rewritten in Vue, and with Nextcloud Office (aka Collabora Online) you get much more than a shitty GDocs.
Of course some apps like the calendar have not been rewritten.
Most readers do not understand what it takes to rewrite the frontend for an entire ecosystem.
82 / 86 requests
1,694 kB / 1,754 kB transferred
6,220 kB / 6,281 kB resources
Finish: 11.73 s
DOMContentLoaded: 1.07 s
Load: 1.26 s
Some specific things I like about it:
* Basic todo app features are compatible with CalDAV clients like tasks.org
* Several ways of organizing tasks: subtasks, tags, projects, subprojects, and custom filters
* List, table, and kanban views
* A reasonably clean and performant frontend that isn't cluttered with stuff I don't need (i.e., not Jira)
And some other things that weren't hard requirements, but have been useful for me:

* A REST API, which I use to export task summaries and comments to markdown files (to make them searchable along with my other plaintext notes)
* A 3rd party CLI tool: https://gitlab.com/ce72/vja
* OIDC integration (currently using it with Keycloak)
* Easily deployable with docker compose

Either apps lack such an export, or it's very minimal, or it includes lots of things, except comments... Sometimes an app might have a REST API, and I'd need to build something non-trivial to start pulling out the comments, etc. I feel like it's silly in this day and age.
My desire for comments to be included in exports is for local search...but also because i use comments for sort of thinking aloud, sort of like an inline task journaling...and when comments are lacking, it sucks!
In fact, when i hear folks suggest to simply stop using such apps and merely embrace the text file todo approach, they cite their having full access to comments as a feature...and, i can't dispute their claim! But barely any non-text-based apps highlight the inclusion of comments. So, i have to ask: is it just me (who doesn't use a text-based todo workflow), plus all the folks who *do use* a text-based todo flow, who actually care about access to comments!?!
<rant over>
My use case looks roughly like this: for a given project (as in hobby/DIY/learning, not professional work), I typically have general planning/reference notes in a markdown file synced across my devices via Nextcloud. Separately, for some individual tasks I might have comments about the initial problem, stuff I researched along the way, and the solution I ended up with. Or just thinking out loud, like you mentioned. Sometimes I'll take the effort to edit that info into my main project doc, but for the way I think, it's sometimes more convenient for me to have that kind of info associated with a specific task. When referring to it later, though, it's really handy to be able to use ripgrep (or other search tools) to search everything at once.
To clarify, though, Vikunja doesn't have a built-in feature that exports all task info including comments, just a REST API. It did take a little work to pull all that info together using multiple endpoints (in this case: projects, tasks, views, comments, labels). Here's a small tool I made for that, although it's fairly specific to my own workflow: https://github.com/JWCook/scripts/tree/main/vikunja-export
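For anyone curious, the general shape of that kind of export is small (a hedged sketch, not the linked tool itself; the `/projects`, `/projects/{id}/tasks`, and `/tasks/{id}/comments` paths are my assumptions about Vikunja's API, and the base URL and token are placeholders):

```python
import json
import urllib.request

BASE_URL = "https://vikunja.example.com/api/v1"  # placeholder instance
TOKEN = "your-api-token"  # placeholder

def get(path: str):
    """GET a JSON endpoint with bearer auth."""
    req = urllib.request.Request(
        BASE_URL + path, headers={"Authorization": f"Bearer {TOKEN}"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

def format_task_md(task: dict, comments: list) -> str:
    """Render one task plus its comments as a markdown section."""
    lines = [f"## {task['title']}", ""]
    if task.get("description"):
        lines += [task["description"], ""]
    for c in comments:
        lines += [f"> {c['comment']}", ""]
    return "\n".join(lines)

def export_all() -> str:
    """Walk projects -> tasks -> comments and emit one searchable markdown doc."""
    sections = []
    for project in get("/projects"):                          # assumed endpoint
        for task in get(f"/projects/{project['id']}/tasks"):  # assumed endpoint
            comments = get(f"/tasks/{task['id']}/comments")   # assumed endpoint
            sections.append(format_task_md(task, comments))
    return "\n".join(sections)
```

The point is just that "elbow grease" here is a few dozen lines, most of it pagination and auth bookkeeping, with the markdown output then greppable alongside everything else.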
Yeah, i like me some kanban! Which is one reason i've resisted the text-based workflow...so far. ;-)
> ...Vikunja doesn't have a built-in feature that exports all task info including comments, just a REST API. It did take a little work...
Aww, man, then i guess i misread. I thought it was sort of easier than that. Well, i guess that's not all bad. It's possible, it just requires a little elbow grease. I used to use Trello, which does include comments in its JSON export, but i had my own little python app to copy out and filter only the key things i wanted - like comments - and reformat them to other text formats like CSV, etc. But Trello is not open source, so it's not an option for me anymore. Well, thanks for sharing (and for making!) your vikunja export tool! :-)
maybe paying customers are getting a different/updated/tuned version of it. maybe not. but the only thing that keeps me using it is that there aren't any real self-hosted alternatives.
why is it slow? if you just blink or take a breath, it touches the database. years ago i tried to optimise it a bit and noticed that there's a horrendous number of DB transactions happening without any apparent reason.
also, the android client is so broken...
but the feeling is that the outdated or simply bad decisions aren't fixed or redesigned.
it could be made 100 times better.