Creating your own cloud with Retool and AWS S3

Creating your own cloud has never been easier: you can build one with S3 and Retool.

Have you ever wondered how to design an asset management system, or your own cloud storage, on top of S3? With Retool you can build low-code applications that would otherwise take a lot of code to write and test. While you can always use the S3 console directly, what if you want your own features on top, or want to separate access? Then you need to consider abstraction.

Low-code tools like Retool, which let you design apps (specifically internal apps, in Retool's case), get some hate from developers because they abstract away the nitty-gritty details of everyday UI work. In the remarks section I will discuss why you should consider low-code tools and why they are more useful than most people think.

This is part 1 of a series on building our own cloud, abstracted in Retool and hosted on S3. It will consist of an uploader, a reader and a deleter. In this part we will tackle uploading to S3, with both random and user-selected file names. It may seem unnecessary at first, but throughout this series you will come to appreciate the low-code approach and the speed at which we get this product going.

I expect you to have basic knowledge of AWS IAM, AWS S3 and Retool. If not, the following links will help you get acquainted:

- https://docs.retool.com/docs/quickstart
- https://aws.amazon.com/premiumsupport/knowledge-center/iam-intro/
- https://docs.aws.amazon.com/AmazonS3/latest/userguide/GetStartedWithS3.html

We will use AWS CLI v2 for all AWS operations, since the console UI may change but the CLI stays constant. This tutorial is complemented and inspired by the official Retool tutorial: https://docs.retool.com/docs/s3-integration-1

Note: Retool has several plans and I am using the free plan. S3 is charged per GB-month.

## TLDR

- Create and configure an S3 bucket that will hold everything we upload.
- Add an IAM user so Retool can write to S3.
- Create a Retool app to host the S3 uploader.

## Create an S3 bucket

We will create an S3 bucket named `ramihikmatnet-testing1`; you can name it anything:

```sh
aws s3 mb "s3://ramihikmatnet-testing1"
```

Let's create a local file named `s3-cors.json`. It will hold the CORS policy Retool needs to work with the bucket; all we have to do is apply its contents to the bucket. We need CORS because Retool runs the S3 upload requests from the browser: there is already an `S3 Uploader` component in Retool, and that is what we will use.

```json
{
  "CORSRules": [
    {
      "AllowedOrigins": ["https://*.retool.com"],
      "AllowedMethods": ["PUT", "POST", "DELETE"],
      "AllowedHeaders": ["*"]
    },
    {
      "AllowedOrigins": ["*"],
      "AllowedMethods": ["GET"]
    }
  ]
}
```

Let's apply the policy to the bucket:

```sh
aws s3api put-bucket-cors --bucket "ramihikmatnet-testing1" --cors-configuration file://s3-cors.json
```

Now the bucket is ready to use. We set two rules: the second object in the `CORSRules` array above allows public reads, while the first lets Retool send `DELETE`, `POST` and `PUT` requests. You are not allowed to make a bucket publicly writable, which is why the first rule is restricted to the Retool domain, so writes are only allowed from that origin. That is not to say you can't introduce more rules for your own needs.

## Create an IAM user

Since we will access and consume S3, we need a user to do it with. A `user` is an AWS concept: an entity that has some level of access within the AWS account or organisation it belongs to.
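Before creating the new user, it can help to confirm which identity your own CLI is currently acting as, since that identity is the one that needs permission to create users and attach policies (see the note further down). A quick, optional sanity check:

```sh
# Show the account and ARN of the identity the CLI is currently using;
# this identity must be allowed to create IAM users and attach policies.
aws sts get-caller-identity
```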
Create the user first:

```sh
aws iam create-user --user-name ramihikmatnet-testing1-retool-user
```

Then we need to grant it some permissions. You can attach standard AWS managed policies, custom policies, or inline policies. An inline policy exists only within the scope of the user, and that is what we need here. Save the following permissions as `user-policy.json`:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "s3:ListBucket",
        "s3:GetBucketAcl",
        "s3:GetBucketCORS",
        "s3:GetBucketLocation",
        "s3:GetBucketLogging",
        "s3:GetBucketNotification",
        "s3:GetBucketPolicy",
        "s3:GetBucketWebsite",
        "s3:GetObject",
        "s3:GetObjectAcl",
        "s3:GetObjectVersion",
        "s3:GetObjectVersionAcl",
        "s3:PutObject",
        "s3:PutObjectAcl",
        "s3:PutObjectTagging",
        "s3:PutObjectVersionAcl",
        "s3:PutObjectVersionTagging"
      ],
      "Resource": [
        "arn:aws:s3:::ramihikmatnet-testing1",
        "arn:aws:s3:::ramihikmatnet-testing1/*"
      ]
    }
  ]
}
```

These permissions allow reading and writing only to the specific resource, namely the bucket we created above. They do not include delete operations. Now we can attach the inline policy to the user as below.

**Note: for this to succeed, the identity you are using to attach permissions to the new Retool user must itself be allowed to grant those permissions, otherwise it is not possible to proceed.**

```sh
aws iam put-user-policy --user-name ramihikmatnet-testing1-retool-user --policy-name RamiHikmatNetRetoolPolicy --policy-document file://user-policy.json
```

All well and good, we now have a user. How can we use it? In AWS you can generate programmatic access keys, a combination of an access key and a secret key, that let a program act as the user we created. The following command returns those keys; keep them safe, and if they are lost, revoke them and create new ones:

```sh
aws iam create-access-key --user-name ramihikmatnet-testing1-retool-user
```

## Create a Retool resource

By now I expect you to know something about Retool. A Retool `resource` is a source of data: a DB connection, Google Sheets, or any of the other sources Retool supports, and `S3` is one of them. An `app` is a Retool application built out of components that use resources. We need to create a Retool resource to access S3 and an app for the UI that interacts with that resource. Note that an app may consume an unlimited number of resources.

Let's head to the resources tab at the top, create a new S3 resource, and fill in the bucket details together with the access and secret keys we generated above. After clicking `Save Changes`, a dialog will pop up prompting you to create an app or go back to the resources list. Choose `Create an app`, then add a name and create the app.

![Creating a Resource, step 1](https://ramihikmatnet-files.s3.ap-southeast-2.amazonaws.com/4f3193d6-d755-4322-bcec-98253d38173b.PNG)

![Creating a Resource, step 2](https://ramihikmatnet-files.s3.ap-southeast-2.amazonaws.com/861f5c4d-1707-44bb-ad0c-33dac59f4efa.PNG)

## Create the Retool app

Now we're in the app. Let's add the components we need. Add the `S3 uploader` component by dragging and dropping it onto the canvas.

![Adding S3 Uploader component](https://ramihikmatnet-files.s3.ap-southeast-2.amazonaws.com/b2c199c0-0584-45d3-b196-934f4dde62a2.PNG)

Configure it on the right-hand side (your layout may be different). Set the component to give uploaded items an `ACL` of `public-read`. Your use case may vary, but I want all files uploaded to S3 to be publicly readable when accessed. Also set the resource it uses to the S3 resource we created earlier.
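Before (and after) testing the uploader, you can sanity-check the setup from the CLI as well. A minimal sketch, assuming you stored the generated access keys in a CLI profile named `retool-user`; that profile name and the `example.png` key are placeholders for illustration, not something created earlier in this tutorial. The inline policy we attached includes `s3:GetBucketCORS`, `s3:ListBucket` and `s3:GetObjectAcl`, so all three calls should be permitted:

```sh
# Confirm the CORS rules were applied to the bucket
aws s3api get-bucket-cors --bucket "ramihikmatnet-testing1" --profile retool-user

# List what has been uploaded so far (empty until the first upload)
aws s3 ls "s3://ramihikmatnet-testing1" --profile retool-user

# Once a file exists, check that its ACL really is public-read
# ("example.png" is a placeholder key)
aws s3api get-object-acl --bucket "ramihikmatnet-testing1" --key "example.png" --profile retool-user
```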
You can now try uploading; if it succeeds, let's continue and add a name. Before doing that, I want files uploaded without a name to get a random numeric id, and files uploaded with a name to get that name plus `-` and the current Unix timestamp. Add a text input component to hold the name.

![Configuring to use text input](https://ramihikmatnet-files.s3.ap-southeast-2.amazonaws.com/880f24a8-3557-4dc5-b8ec-3194a46cca51.PNG)

Update the uploader component's `Override file name` field from empty to the following JS code:

```js
{{ textInput1.value ? `${textInput1.value}-${Date.now()}` : `${_.random(1000, 999999999)}${Date.now()}` }}
```

Whenever you write `{{ }}` in Retool, whatever is inside is JavaScript code. The code above implements the naming logic described: a chosen name with a timestamp suffix, or a random numeric id otherwise. That's it, it is working.

**With name**

![With name](https://ramihikmatnet-files.s3.ap-southeast-2.amazonaws.com/f58361c2-04ad-4fa6-bb0f-bfcc44726678.PNG)

**Without name**

![Without name](https://ramihikmatnet-files.s3.ap-southeast-2.amazonaws.com/bc112dbf-9bfe-4f24-b157-18f0d3a045f7.PNG)

## Remarks

Why did we go with Retool when we could have written the code ourselves? Of course we could have. However, Retool provides the strategic advantage of letting you design internal apps without worrying about coding. Writing the code takes far more time to achieve the same outcome, even if it may be more satisfying for the programmers reading this tutorial. Retool is often mischaracterised as a job replacer, but in reality it is a tool that enhances productivity by letting you focus on new challenges rather than ones already solved. It can be used for all kinds of things: data analytics dashboards, internal tracking, and anything that involves an API. My argument is essentially about productivity and keeping up with the age of automation: to advance in a software engineering career we need to delegate more of the manual, easy tasks to tools. The value of a programmer lies in solving problems and getting things done, not in using a particular tool; although we used Retool here, it has its fair share of competition. Here are some articles that may be of help on this topic:

- https://hackernoon.com/the-pros-and-cons-of-low-code-development-4y2p33g9
- https://codebots.com/low-code/what-is-no-code-the-pros-and-cons-of-no-code-for-software-development
- https://www.nintex.com/blog/see-why-no-code-solutions-are-important/

Thanks for reading through this.