Advice needed for webdev

I may have bitten off more than I can chew… I'm trying to develop a new WordPress-based site on Google Cloud Platform while integrating a couple of REST-based APIs. My experience is limited to playing around with PHP scripts, CSS fixes, and a dabbling of JS for mostly front-end stuff, and I feel very out of date with all the different platforms, tools, and environments. I'm used to running a simple site where I'd just upload PHP files via FTP to a hosted server, without any external calls. I've never set up a server from scratch, so I haven't played with the Unix side much; any time I had to touch the server I used a cPanel backend.

Can someone recommend a workflow? There are a lot of industry practices I feel like I'm missing, like building a dev environment, using git to push updates, and working locally rather than directly on the server.

The APIs I want to use seem to rely on a dependency manager called Composer, but I don't really understand how it fits into what I want to do. Is it as simple as installing Composer and the package I want on the Unix side, then making calls in my PHP scripts (from inside the WordPress htdocs directory)? Am I just overthinking it? I've been struggling to find a guide or documentation that's relevant to my setup. There are lots of unknowns because I'm attempting this inside a GCP Bitnami deployment for WordPress Multisite, and working with the VM is a lot lower-level than anything I've done before.

I'm basically plowing through this head first without much guidance, and with my google-fu it's painstakingly slow progress. With so many platforms in play, it's getting blurry what level each thing operates at.

Any help is appreciated.

Project Specs:
GCP Bitnami Wordpress Multisite Deployment running:

  • OS: Debian (9)
  • Ghostscript (9.05)
  • Apache (2.4.41)
  • ImageMagick (6.9.8)
  • lego (3.3.0)
  • MySQL (8.0.18)
  • OpenSSL (1.1.1d)
  • PHP (7.3.14)
  • phpMyAdmin (5.0.1)
  • SQLite (3.31.1)
  • Varnish (6.0.6)
  • WordPress Multisite (5.3.2)
  • WP-CLI (2.4.1)

APIs I'm trying to integrate: HubSpot and QuickBooks
Future work: Collecting data from ESP32 based sensors (not sure if I should host this separately/locally and only integrate relevant information)

Composer is PHP's dependency manager. You use it to download and "install" dependencies for your application; think apt-get, but local to the root of your application. It downloads the PHP files for a given library into your app and makes them available through an autoloader. With source control, the workflow is that you commit only the composer.json file (plus the composer.lock it generates) to your repo instead of copies of the actual dependencies. Then a continuous-integration environment/pipeline checks out that code, runs composer install to download the dependencies, and deploys, or builds an image: either a machine image or just an app image like Docker. This keeps your repo smaller and makes dependency updates and interdependencies easier to manage.
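To make that concrete, a minimal composer.json might look like the following. The Guzzle HTTP client is used here purely as a stand-in for whichever API SDK you actually need:

```json
{
    "require": {
        "guzzlehttp/guzzle": "^6.5"
    }
}
```

Running composer install next to this file downloads the packages into a vendor/ directory and pins the exact versions in a generated composer.lock.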

I'm not super familiar with Bitnami, but I believe in your case it's effectively a machine image with the PHP stack + WordPress pre-installed.

Assuming you're just making a separate PHP app/site that happens to co-exist on the same server as WordPress, you either upload the folder of scripts to a path that's reachable via your WordPress pathing, or you create another site/path in your Apache config. I'm not sure how multisite WordPress is intended to be configured.

So you install the dependencies with Composer locally (your local machine "building" the app), then upload everything via SFTP (file transfer over SSH) to your server. Or you could commit your code to a git repo and have your server check out a deploy branch (or master) and run an install script that runs Composer and moves the files into place. You could use a cron job to periodically check for new changes in the repo, or go more advanced with a GitHub webhook, etc.
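As a rough sketch of that server-side option (the path, branch name, and cron schedule are made up for illustration, and this assumes the repo is already cloned on the server):

```shell
#!/bin/sh
# Hypothetical location of the checked-out code -- adjust to your layout.
APP_DIR=/opt/bitnami/app/myapp

deploy() {
    cd "$APP_DIR" || return 1
    git pull origin master              # pull the latest deploy branch
    composer install --no-dev           # fetch locked dependencies, skip dev-only ones
    # ...then copy/symlink files into the web root and fix ownership as needed
}

# Run manually, or from a cron entry that polls for changes, e.g.:
#   */15 * * * * /path/to/deploy.sh
```

The --no-dev flag skips packages only needed for testing, which keeps the production footprint smaller.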

This all kind of depends on what PHP scripts/apps/sites you're adding to your WordPress deployment, or whether you're modding WordPress itself. But assuming you're just leveraging the ease of launching a WordPress site via a Bitnami image and then modifying that running instance to add your new stuff, my comments are probably relevant. If you're trying to make the app horizontally scalable, you might be looking at building Bitnami or Kubernetes (k8s) containers instead, shrugs.

Hopefully there is some context or a keyword in my ramblings that helps your google-fu!


I think it is worth noting that WordPress can apparently use the SQLite database engine instead of an SQL database server such as MySQL or MariaDB. That avoids the failure mode where the HTTP server is up but the site is still down because the database server is down, since SQLite is a shared library loaded by the HTTP server process rather than a separate server process. That said, I have no experience using WordPress with SQLite instead of an SQL database server, but I would be interested to hear your experience if you try it.

Also, you should probably use the Secure Shell File Transfer Protocol, Secure Copy (scp), Secure Shell File System (sshfs with FUSE), or something else instead of the old-school File Transfer Protocol. Note that, confusingly, both the Secure Shell File Transfer Protocol and the Secure File Transfer Protocol are abbreviated SFTP even though they are different.

Also, do you have a specific reason for using Debian 9, given that the current release of Debian is 10?

Thank you guys for the responses so far. It’s given me enough to at least take the next step deeper…

What I've been able to do so far is set up an SFTP connection so I can now upload files directly to the server. That fixes a bit of the workflow, though it appears the FTP user I set up doesn't have full permissions, so I'm getting access restrictions when I try to download or upload certain files. I have to chmod a file to 776 before I can overwrite it, so I think I need a chown or usermod command to fix it properly. Still googling that one…

I've also gone and installed Composer as well as the API packages that I wanted, and that went off without a hitch.

Now I'm kind of stuck figuring out how to actually use the APIs: authenticating and making calls. Annoyingly, all the sample code seems to be built around something called Docker, which I'm not familiar with at all. Whenever I find sample code, it's a whole repository broken into many files, each for a different function. Working with a SQL database on the same server, I'd just pass a user ID and password and then start making SELECT calls, all within the PHP page. Seeing it broken up like this makes it difficult for me to understand the structure of what they're doing. I could probably just import all the demo code and see if it works, but I'm not even sure what directory level I should install it at. Since there's at least one config file with the authentication keys, is it safe to deploy it inside the htdocs of my WordPress directory? Is it just a matter of setting permissions on that file? I can see that wp-config.php is set to 640, which doesn't even let me download it via FTP or open it over SSH without sudo.

Ultimately, what I am attempting to do is pull data from these APIs and wrap it inside the WordPress front end. I'll probably be writing a lot of the front-end code to make it look nice, but I'm trying to hang on to some of the navigation panels provided by WordPress to connect it to the rest of the site. The goal is to centralize a lot of this information in one place, which will hopefully make administration and the user experience a lot simpler. Right now, accessing 4-5 different platforms is a nightmare on the admin side, and users currently have no way to interface with those platforms, so we have to do it for them. This project is meant to simplify everything (by making me into a code monkey right now).

@brolin the server is using Debian 9 because that's what the deployment automatically installed. I didn't really have a choice in the services installed, in exchange for getting the whole thing set up in about 5 minutes.
It's something I'll have to look into later with regard to how I'm supposed to update some of these things in the future.

Do you know how to use scp (Secure Copy)? You should not need FTP if you have Secure Shell (SSH) access to the remote host, because you can use scp to copy files between hosts. As with cp, you should usually use the -p option to preserve the file system timestamps, at least the last-modified time, when copying a file. Yes, this should be the default behaviour, with an option to suppress it for a single run, but this bad design is long established.
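For reference, the basic invocations look like this; the username, host, and paths are placeholders to substitute with your own:

```shell
# Copy one file up to the server, preserving timestamps (-p):
scp -p local-file.php user@your-vm-ip:/opt/bitnami/app/wordpress/htdocs/

# Copy a directory recursively (-r):
scp -rp ./QBtest user@your-vm-ip:/opt/bitnami/app/wordpress/htdocs/

# Pull a file down from the server:
scp -p user@your-vm-ip:/opt/bitnami/app/wordpress/htdocs/some-file.php .
```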

Do you know how to use rsync (over SSH)?

Not sure it matters. Yes, I had SSH access, but I could only modify my htdocs directory by taking root authority, and I think I would have hit the same issue with my SSH user. I was able to find a solution that involved changing the access control lists (ACLs) to give my FTP user rwx on that whole directory.

The issue was that the directory was actually owned by a 'bitnami' user; I suppose that's how it was set up in order to deploy. I believe the SSH user is created separately by GCP to access the VM, which puts it into a different owner/group, and the FTP user was in the same boat.
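For anyone who hits the same wall, the ACL commands look roughly like this; 'sftpuser' is a placeholder for whatever account your SFTP client logs in as, and the path is the one from this thread:

```shell
# Give sftpuser read/write/execute on the existing tree:
sudo setfacl -R -m u:sftpuser:rwx /opt/bitnami/app/wordpress/htdocs

# Add a default ACL so newly created files inherit the same access:
sudo setfacl -R -d -m u:sftpuser:rwx /opt/bitnami/app/wordpress/htdocs

# Inspect the result:
getfacl /opt/bitnami/app/wordpress/htdocs
```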

Anyways, I'm happy to report that I've gotten one API to work! QuickBooks had much simpler sample code, and though it took a bit of debugging I was able to make my first call. Hooray!

Now if only I could get HubSpot to do the same. I feel like I wrapped my head around it a little better with QB, but HubSpot's documentation isn't as good.

BTW, my Composer install ended up in the /opt/bitnami/app/vendor/composer directory;
however, the PHP files are over in /opt/bitnami/app/wordpress/htdocs/QBtest/ and need a require_once statement referencing the Composer autoload.php in the first directory.

Before I modified the code it said something like require_once(__DIR__ . '/vendor/autoload.php');

Was I supposed to install Composer closer to the execution directory? Is it bad practice to hard-code the full directory path?
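On that layout question: the usual convention is to run Composer from the root of the app that uses it, so vendor/ ends up next to the scripts and the relative require works unchanged. Roughly, using the paths from this thread (the package name shown is my best recollection of the QuickBooks PHP SDK's Packagist name, so verify it against the SDK docs):

```shell
cd /opt/bitnami/app/wordpress/htdocs/QBtest
composer require quickbooks/v3-php-sdk    # creates composer.json and vendor/ here
# then in the PHP scripts:
#   require_once __DIR__ . '/vendor/autoload.php';
```

Hard-coding the absolute path works, but it breaks the moment the site moves or is redeployed; keeping vendor/ beside the code and resolving it relative to __DIR__ is the portable choice.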