# check_time_of_operation(
    order_by(
        select(table, columns[], ?where_filter(conditions[], operations[]))
    , ?properties[])
); // rem: for empty results, returning false is better; this requires strict types not to be enabled#
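The composed query call above can be sketched as nested functions. This is an illustrative Python sketch, not the PHP implementation; the names mirror the pseudocode, and the `(column, value)` shape of `conditions` is an assumption:

```python
# Sketch of the composable query API: select -> order_by, with an optional
# where_filter. Empty results become False, as the note above suggests.

def where_filter(conditions, operations):
    """Build a row predicate from parallel (column, value) / operator lists.
    The (column, value) tuple shape is a hypothetical convention."""
    ops = {"=": lambda a, b: a == b, ">": lambda a, b: a > b, "<": lambda a, b: a < b}
    def predicate(row):
        return all(ops[op](row[col], val)
                   for (col, val), op in zip(conditions, operations))
    return predicate

def select(table, columns, where=None):
    rows = [r for r in table if where is None or where(r)]
    result = [{c: r[c] for c in columns} for r in rows]
    return result if result else False   # empty result => False, not []

def order_by(rows, properties=None):
    if rows is False:                    # propagate the "empty" flag unchanged
        return False
    for prop in reversed(properties or []):   # stable sort => multi-key ordering
        rows = sorted(rows, key=lambda r: r[prop])
    return rows

users = [{"id": 2, "name": "bo"}, {"id": 1, "name": "al"}]
print(order_by(select(users, ["id", "name"]), ["id"]))
```

Keeping `False` (rather than an exception) for empty results lets every layer pass the flag through without extra checks, which matches the remark about not enabling strict types.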
##
##
#don't forget:
transactions + aggregate fns
+ the 3 join fns#
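The aggregate fns could be one dispatcher over a single column. A minimal Python sketch; the name `aggregate` and the string-keyed dispatch are hypothetical:

```python
# Hypothetical aggregate fns (COUNT, SUM, AVG, MIN, MAX) over one column
# of an array-of-rows table.

def aggregate(table, column, fn):
    values = [row[column] for row in table]
    fns = {
        "count": len,
        "sum": sum,
        "avg": lambda v: sum(v) / len(v) if v else False,  # empty => False
        "min": min,
        "max": max,
    }
    return fns[fn](values)

orders = [{"qty": 3}, {"qty": 5}, {"qty": 4}]
print(aggregate(orders, "qty", "sum"))
```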
##
#JSON_FILE_DB#
##
#System Kerberos procedural language (PL) specification draft:#
##
##
##
##
#How to achieve indexing? => Automated column sorting + a fast search algorithm, O(log n)?#
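The sorted-column + log(n) idea is exactly a binary-searched index. A sketch in Python (the helper names are hypothetical, the technique is standard binary search):

```python
import bisect

# Indexing sketch: keep a per-column index sorted, then locate a key in
# O(log n) with binary search instead of scanning the whole table.

def build_index(table, column):
    """Sorted list of (value, row_position) pairs for one column."""
    return sorted((row[column], i) for i, row in enumerate(table))

def index_lookup(index, key):
    """Return the row positions whose indexed value equals key."""
    lo = bisect.bisect_left(index, (key,))   # (key,) sorts before (key, pos)
    hits = []
    while lo < len(index) and index[lo][0] == key:
        hits.append(index[lo][1])
        lo += 1
    return hits

users = [{"id": 7}, {"id": 3}, {"id": 9}, {"id": 3}]
idx = build_index(users, "id")
print(index_lookup(idx, 3))
```

The index stores row positions, not rows, so it stays small; the cost moves to keeping it sorted on each write, which is the trade-off the note hints at with "automated column sorting".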
##
#Joins used in the db would be: inner join, left join, right join; the table[] in select must then be an array => of length 0 or > 0#
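The three planned joins over array-of-dict rows could look like this. An illustrative Python sketch under the assumption that rows are dicts and the join key exists on both sides; the function names are hypothetical:

```python
# The 3 join fns: inner keeps only matches, left pads missing right-side
# columns with None, right is the mirror of left.

def inner_join(left, right, key):
    return [{**l, **r} for l in left for r in right if l[key] == r[key]]

def left_join(left, right, key):
    pad = {k: None for r in right[:1] for k in r if k != key}  # columns to null-fill
    out = []
    for l in left:
        matches = [{**l, **r} for r in right if r[key] == l[key]]
        out.extend(matches if matches else [{**l, **pad}])
    return out

def right_join(left, right, key):
    return left_join(right, left, key)   # mirror of left join

a = [{"id": 1, "city": "Brno"}, {"id": 2, "city": "Praha"}]
b = [{"id": 1, "name": "al"}]
print(inner_join(a, b, "id"))
print(left_join(a, b, "id"))
```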
##
##
##
##
#Think about how to bypass PHP's 128 MB memory limit (memory_limit) when fns heap up results, i.e. if the DB has a problem with nesting fns into each other.
One solution is pagination of memory or of return values. Another might be views, known from Postgres and other DBs:
e.g. I peek at the first few hundred records, then I continue to the next page to see another chunk,
each chunk only up to the memory limit.#
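The pagination idea can be sketched as reading fixed-size pages straight from disk, so no more than one page ever sits in memory. Python sketch under the assumption of a one-JSON-row-per-line file format; `read_page` is a hypothetical name:

```python
import json

# Pagination sketch: return only rows [page*page_size, (page+1)*page_size)
# from the table file, like scrolling a DB cursor/view one chunk at a time.

def read_page(path, page, page_size=100):
    start, stop = page * page_size, (page + 1) * page_size
    rows = []
    with open(path) as fh:
        for i, line in enumerate(fh):
            if i >= stop:
                break            # stop reading: later rows never enter memory
            if i >= start:
                rows.append(json.loads(line))
    return rows
```

The caller asks for page 0, then page 1, and so on; peak memory is bounded by `page_size` rows regardless of table size, which is the point of the note above.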
##
#Consider generators yielding vs loops breaking/continuing in particular code parts.
And what is yielding? Isn't it jumping out of the fn, then jumping back in via super global vars?#
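To the question above: yielding does suspend the fn and later resume it, but the locals survive inside the generator object itself, not in super global vars. Sketched in Python; PHP's generators (`yield`) behave the same way:

```python
# Yield suspends the fn at the yield point; its locals are kept in the
# generator object, so two generators never share state via globals.

def row_numbers():
    n = 0                 # local state, preserved across yields
    while True:
        n += 1
        yield n           # suspend here; next() resumes the loop right after

gen_a = row_numbers()
gen_b = row_numbers()
print(next(gen_a), next(gen_a))  # gen_a remembers its own n
print(next(gen_b))               # gen_b has independent state => no globals involved
```

That independence is what makes generators safer than loops communicating through shared state when several scans run at once.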
##
##
#Aim: to have a database running on plain PHP, limited only by the speed of a fast SSD disk on your hosting.#
##
#Just BRAINSTORMING, more can follow on this >>>#
##
##
#The end of every CRUD fn is flagged; if the end flag isn't present => no transaction!#
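The end-flag rule can be sketched as a journal with a commit marker: each CRUD fn writes its operation and then the flag, and replay drops any entry whose flag is missing (the fn never finished). Python sketch; the log format and names are hypothetical:

```python
import json

# End-flag sketch: the END marker is written last, so its absence means the
# CRUD fn was interrupted => no transaction, the entry is discarded on replay.

END_FLAG = "END"

def write_op(log, op):
    log.append(json.dumps(op))
    log.append(END_FLAG)          # written only when the fn reached its end

def replay(log):
    committed, i = [], 0
    while i < len(log):
        flagged = i + 1 < len(log) and log[i + 1] == END_FLAG
        if flagged:
            committed.append(json.loads(log[i]))
            i += 2
        else:
            i += 1                # no end flag => no transaction, drop it
    return committed

log = []
write_op(log, {"op": "insert", "row": {"id": 1}})
log.append(json.dumps({"op": "insert", "row": {"id": 2}}))  # crashed mid-write: no flag
print(replay(log))
```

Only the flagged insert survives replay; the unflagged one is treated as an aborted transaction, which is exactly the rule the note states.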
##