This CMS is a simple tool for users to build and deploy their websites.
Deployed websites report the activity and performance of each page visit back to the CMS. In this way, website authors get to “know” their visitors and their experience interacting with the site in real time.
Google Analytics, by contrast, is a comprehensive tool that must be configured separately in popular website builders and offers real-time reporting only for the last 30 minutes.
This CMS deploys websites with real-time analytics built in and reports the most salient details: the who, when, where and from-where of each page visit, the elapsed time spent on each page, and the core performance metrics that Google uses in its search ranking algorithm.
Having said all that, the purpose of this post is to capture ideas for improving the CMS.
Allow custom-domain website owners to receive their emailed activity reports according to their own frequency / conditions. Currently, emails are sent daily as part of the daily backup job, but this can quickly become quite boring - so we should let website authors choose for themselves when and under what conditions these reports are emailed.
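One possible shape for this, assuming a hypothetical preferences column on an authors table - every table, column and procedure name below is invented for illustration, not the CMS’s actual schema:

```sql
-- Hypothetical sketch: inside the daily job, skip authors whose chosen
-- frequency says no report is due today. All names here are invented.
BEGIN
  FOR a IN (SELECT author_id, report_frequency, last_report_sent
              FROM website_authors
             WHERE custom_domain IS NOT NULL) LOOP
    IF a.report_frequency = 'DAILY'
       OR (a.report_frequency = 'WEEKLY' AND a.last_report_sent <= SYSDATE - 7)
    THEN
      send_activity_report(a.author_id);  -- hypothetical procedure
    END IF;
  END LOOP;
END;
/
```

A “conditions” option (e.g. only email when visits exceed a threshold) would just be another predicate inside the loop.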
Think about (maybe) using the attribution build, i.e. import {onLCP, onINP, onCLS} from 'web-vitals/attribution', to report on the identity of the page elements that cause sub-optimal Google CWV ratings.
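As a sketch of how the attribution build could feed the reports - the report format and the sendToCms function are invented here; the attribution fields used (element, url) are ones the web-vitals library documents for LCP:

```javascript
// Pure helper: turn a web-vitals metric (with its attribution object) into a
// one-line report. Kept free of browser APIs so it can be tested anywhere.
function formatLcpReport(metric) {
  const a = metric.attribution || {};
  const elem = a.element || 'unknown element';
  const url = a.url ? ` loading ${a.url}` : '';
  return `${metric.name} ${Math.round(metric.value)}ms (${metric.rating}) caused by ${elem}${url}`;
}

// In the deployed page (browser only):
// import { onLCP } from 'web-vitals/attribution';
// onLCP((metric) => sendToCms(formatLcpReport(metric)));  // sendToCms is hypothetical
```

The same pattern would extend to onINP and onCLS, whose attribution objects name the interaction target and largest-shift target respectively.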
For “blog” pages (maybe “media” collections too) we ought to offer a “comment” option so readers can submit comments and have these appear as a list following the blog entry. This would improve engagement, but requires a lot of work, including:
Include the actual bytes transferred in performance reporting, with the caveat that Safari did not report this until version 17. Website authors may also want to know which media have been “seen” on their sites, in addition to which pages have been visited. The idea is to convey the cost of including media on web pages and raise awareness of the importance of optimising media for delivery.
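The bytes-transferred idea could lean on the Resource Timing API; a sketch, where transferSize and encodedBodySize are standard Resource Timing fields but the filtering and the report function are hypothetical:

```javascript
// Sum the network bytes for a set of resource timing entries. Safari before
// version 17 does not populate transferSize, and a cache hit also reports 0,
// so we fall back to encodedBodySize as an upper-bound estimate.
function totalTransferBytes(entries) {
  return entries.reduce((sum, e) => {
    if (typeof e.transferSize === 'number' && e.transferSize > 0) {
      return sum + e.transferSize;
    }
    return sum + (e.encodedBodySize || 0);
  }, 0);
}

// In the browser, for media entries only (the initiatorType filter is illustrative):
// const media = performance.getEntriesByType('resource')
//   .filter(e => ['img', 'video', 'audio'].includes(e.initiatorType));
// report(totalTransferBytes(media));  // report() is hypothetical
```

Grouping the entries by initiatorType would also give the “which media have been seen” breakdown from the same data.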
Determine how to use WebAuthn for password-free, secure authentication. Refer to https://tm-apex.hashnode.dev/enabling-passwordless-authentication-in-apex-using-oracle-identity-and-access-management
It would be very nice to offer this as an alternative to Google authentication, giving users a proper passwordless login.
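For orientation, the browser side of a WebAuthn registration looks roughly like the sketch below; the rp/user values are placeholders, the challenge must come from the server, and only the small base64url helper (needed because WebAuthn exchanges binary buffers) is concrete:

```javascript
// WebAuthn APIs hand back ArrayBuffers, but they travel to the server as
// base64url strings; this pure helper does the conversion.
function bufferToBase64url(buf) {
  const bytes = new Uint8Array(buf);
  let bin = '';
  for (const b of bytes) bin += String.fromCharCode(b);
  return btoa(bin).replace(/\+/g, '-').replace(/\//g, '_').replace(/=+$/, '');
}

// In the browser (all rp/user values are placeholders):
// const cred = await navigator.credentials.create({
//   publicKey: {
//     challenge: serverChallenge,                    // Uint8Array from the server
//     rp:   { name: 'CMS', id: 'example.com' },
//     user: { id: userIdBytes, name: 'alice', displayName: 'Alice' },
//     pubKeyCredParams: [{ type: 'public-key', alg: -7 }]  // ES256
//   }
// });
// sendToServer(bufferToBase64url(cred.rawId));       // sendToServer is hypothetical
```

The server-side verification is the substantial part; the linked post covers how that can be wired into APEX.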
Configure this to show all features. Personalise the domain name. Don’t allow this website to be deleted. Explain that it provides detailed explanations of all website-building features, etc. Maybe auto-publish it on initial connection, so the user can “see” the deployment process and immediately access the site with their name on it.
Enable users to restore article content as of a user-specified time. Oracle provides flashback queries that could make this possible, but the approach would soon break down in a multi-user environment due to limited undo/rollback retention.
This should be provided in addition to the configured CKEditor undo / redo feature, which is available only for the content currently being edited.
Perhaps we could keep a time-stamped copy of each article.body_html before it is pulled into the editor? This could be maintained in the “other” Oracle database that comes as part of the “Always Free” subscription and accessed over a DB link; that would offer reasonable performance and put that “other” database to some use, avoiding losing it through non-use.
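Both halves of the idea can be sketched in SQL. The article table and body_html column are as named above; the history table, DB link name and bind variable are invented for illustration:

```sql
-- Flashback query sketch: read an article's body as of a given time.
-- Works only while undo data is still retained (UNDO_RETENTION), which is
-- why it breaks down as a long-term restore mechanism.
SELECT body_html
  FROM article AS OF TIMESTAMP (SYSTIMESTAMP - INTERVAL '30' MINUTE)
 WHERE article_id = :id;

-- Alternative: take an explicit time-stamped copy each time an article is
-- opened for editing. The history table could live in the second Always Free
-- database behind a DB link (names below are hypothetical).
INSERT INTO article_history@history_db (article_id, saved_at, body_html)
SELECT article_id, SYSTIMESTAMP, body_html
  FROM article
 WHERE article_id = :id;
```

The history-table route trades a little storage for restore points that survive indefinitely, which the flashback route cannot offer.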
Currently, you can open individual pages in CodePen, which is only useful if you understand HTML, CSS and JavaScript.
Also provided is the option to upload any CSS and/or JavaScript that you change in CodePen; the idea is that savvy users can replace my code with their own. If you do this, then the CSS and JavaScript delivered with each deployment is yours. I’m really not sure about this option: it seems dangerous, although it offers flexibility to experienced website developers.
Change the sending email address to info@adfreesites.com and change the email text so that recipients of contact forms are not too alarmed.
Solution: Restart the ADB instance from the OCI console
According to an Oracle support note, this is resolved from version 23 (to be confirmed), so the CMS should be deployed on a version 23 database. Currently it’s running on version 19, which is due for the chop this year.
The automatic daily backup job has run successfully hundreds of times. However, there have been two occasions when the job failed, both of which were related to GitHub:
Resolved by simply running the job manually like this:
BEGIN
dbms_scheduler.run_job('DAILY_BACKUP');
END;
/
I have no idea what caused these failures. Maybe I should use Oracle-supplied packages instead of my own code to interface with GitHub?
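If the GitHub interface does move to Oracle-supplied code, the DBMS_CLOUD_REPO package on Autonomous Database looks like the natural candidate. The sketch below is from memory only: the credential, repo and file names are placeholders, and the procedure and parameter names should be verified against the documentation before relying on them:

```sql
-- Sketch only: assumes a pre-created GitHub credential and verify all
-- DBMS_CLOUD_REPO procedure/parameter names against the Oracle docs.
DECLARE
  l_repo   CLOB;
  l_backup BLOB;  -- the export produced by the backup job
BEGIN
  l_repo := DBMS_CLOUD_REPO.INIT_GITHUB_REPO(
              credential_name => 'GITHUB_CRED',
              repo_name       => 'cms-backups',
              owner           => 'my-github-user');
  DBMS_CLOUD_REPO.PUT_FILE(
    repo      => l_repo,
    file_path => 'backups/daily_export.sql',
    contents  => l_backup);
END;
/
```

Even if the root cause of the two failures remains unknown, delegating the GitHub REST handling to a supported package would at least narrow the suspects to my own code.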