Integrating Smush To Optimize Stored Images

    • 275 posts
    April 17, 2020 10:34 PM EDT

    Has anyone integrated Smush for image optimization?

     

    I did try it on some WordPress sites, and it's fantastic.
    It works seamlessly and is also integrated with a CDN.

     

    I was wondering if it's possible to run Smush on all the images that are stored by SocialEngine and its third-party plugins,

     

    so that we can install it and run it on all the directories.

    Thanks!

    • 265 posts
    April 19, 2020 2:16 PM EDT

    I just googled "smush images" and it came up with a WordPress plugin. It looks like it does something similar to Google's mod_pagespeed, except that it only deals with images?

     

    It might be easier to integrate PageSpeed with whatever you are using, as essentially it just acts as an intelligent caching proxy, and once configured it wouldn't actually need integrating with SocialEngine.

     

    I have it on my list of things to do with my site.

     

    If you want to start reading about it, have a look here; in particular, I would start by enabling the rewrite_images and resize_rendered_image_dimensions directives.
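    For reference, enabling those on Apache looks roughly like the snippet below (a sketch from memory of the mod_pagespeed config syntax; nginx uses the "pagespeed EnableFilters ..." form instead, and the file location depends on your setup):

        # e.g. /etc/apache2/mods-available/pagespeed.conf (typical location, adjust as needed)
        ModPagespeed on
        # rewrite_images recompresses and optimizes the images PageSpeed serves;
        # resize_rendered_image_dimensions shrinks them to the size they are
        # actually rendered at in the page.
        ModPagespeedEnableFilters rewrite_images,resize_rendered_image_dimensions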


    This post was edited by abuk at April 19, 2020 2:31 PM EDT
    • 275 posts
    April 20, 2020 11:56 PM EDT

    The goal is to manage the size of users' image uploads stored in AWS S3.

    Storage can get enormous, and optimizing each image before it's saved could reduce the space required.

     

    Since we also use third-party plugins (audio, video, photo) for uploads, it can get messy.

    The trick, I think, is to optimize the uploaded image just before it's saved in storage.

    Thanks, I will read up on your link :)

     

    I am trying to implement this awesome custom solution by PeppaPigKilla,

    using his Lambda script:

    https://community.socialengine.com/forums/topic/1472/image-optimisation-kraken-io/view/post_id/8410

    1) Store images in S3.

    2) Optimize each image before it gets saved in S3 (see the rough sketch below).
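    What I have in mind for the Lambda is roughly the sketch below. To be clear, this is my own rough outline, not PeppaPigKilla's actual script: the bucket/key handling, the Kraken.io request fields, and the "overwrite the same key" approach are assumptions to check against the linked thread.

        import urllib.request

        import boto3
        import requests  # bundled with the Lambda deployment package

        s3 = boto3.client("s3")

        # Placeholder credentials; in practice these would come from environment
        # variables or Secrets Manager.
        KRAKEN_AUTH = {"api_key": "YOUR_KEY", "api_secret": "YOUR_SECRET"}

        def handler(event, context):
            # Invoked by an s3:ObjectCreated:* notification on the uploads bucket.
            for record in event["Records"]:
                bucket = record["s3"]["bucket"]["name"]
                key = record["s3"]["object"]["key"]

                # Ask Kraken.io to fetch and optimize the image. This assumes the
                # object is readable via a plain S3 URL (or a pre-signed one);
                # the field names follow the kraken.io /v1/url API as I understand it.
                source_url = f"https://{bucket}.s3.amazonaws.com/{key}"
                resp = requests.post(
                    "https://api.kraken.io/v1/url",
                    json={"auth": KRAKEN_AUTH, "url": source_url,
                          "wait": True, "lossy": True},
                ).json()
                if not resp.get("success"):
                    print("Kraken failed for", key, resp)
                    continue

                # Replace the original with the optimized version. Writing back to
                # the same bucket/key re-fires the trigger, so a real setup would
                # write to a separate prefix or tag processed objects to avoid a loop.
                optimized = urllib.request.urlopen(resp["kraked_url"]).read()
                s3.put_object(Bucket=bucket, Key=key, Body=optimized)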

     


    This post was edited by playmusician at April 21, 2020 12:03 AM EDT
    • 275 posts
    April 20, 2020 11:57 PM EDT
    abuk said:

    I just googled "smush images" and it came up with a WordPress plugin. It looks like it does something similar to Google's mod_pagespeed, except that it only deals with images?

     

    It might be easier to integrate PageSpeed with whatever you are using, as essentially it just acts as an intelligent caching proxy, and once configured it wouldn't actually need integrating with SocialEngine.

     

    I have it on my list of things to do with my site.

     

    If you want to start reading about it, have a look here; in particular, I would start by enabling the rewrite_images and resize_rendered_image_dimensions directives.

    I was looking at Smush because I'd rather leave the optimization to pros like Smush (via an API) or Kraken.io, so they can worry about the constantly evolving techniques in the future.

     

    I tried Smush out on a WordPress website and it's brilliant. Unfortunately, it doesn't have an API for non-WordPress sites, unlike Kraken.io.


    This post was edited by playmusician at April 21, 2020 12:04 AM EDT
    • 265 posts
    April 21, 2020 3:15 PM EDT

    The goal is to manage the size of users' image uploads stored in AWS S3.

    Storage can get enormous, and optimizing each image before it's saved could reduce the space required.

     

    Sorry, that's not what the solution I proposed does: it accesses the images, removes all the unnecessary content, resizes them so they are the correct size for where they are being displayed, and caches the result so that the 'smushing' only happens once. However, the original images are still stored exactly as they were uploaded (and modified). It saves bandwidth and speeds up the page, though.

     

    What you would need is to modify the images either as they are uploaded or after they are uploaded. How are you currently storing the images: CDN, local files, etc.?

     

    One way of achieving what you want without too much effort or modifying the code would be to modify the files in place, e.g. optimise the JPEG rather than re-encoding it to WebP or something; essentially you would just be reducing the file size. You would run a script that finds all the files, modifies them, and saves the result, then use a cron job to run it every 24 hours or so to apply the fix to new files.
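    As a rough illustration of that approach (the upload path and quality setting below are placeholders, and note that re-saving a JPEG like this is lossy, so keep the quality reasonably high):

        import pathlib

        from PIL import Image  # pip install Pillow

        # Hypothetical upload directory; point this at wherever SocialEngine and the
        # third-party plugins actually write their image files.
        UPLOAD_ROOT = pathlib.Path("/var/www/public/photos")

        def optimise_in_place(path: pathlib.Path, quality: int = 85) -> None:
            """Re-save a JPEG with Pillow's optimiser, overwriting the original file."""
            before = path.stat().st_size
            with Image.open(path) as im:
                im.load()  # read the pixel data before we overwrite the file
                im.save(path, "JPEG", quality=quality, optimize=True, progressive=True)
            after = path.stat().st_size
            print(f"{path}: {before} -> {after} bytes")

        if __name__ == "__main__":
            # A cron entry such as "0 3 * * * python3 /opt/optimise_uploads.py" would
            # re-run this nightly to pick up newly uploaded files. Only .jpg is handled
            # here for brevity, and a real version would track already-processed files
            # (e.g. by mtime) so it does not recompress them on every run.
            for path in UPLOAD_ROOT.rglob("*.jpg"):
                optimise_in_place(path)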

     

    How many images and how big are you talking?

     

    How many thumbnails are the third-party plugins storing?

     

    What steps are you/the plugins taking to reduce the file size as they are being uploaded?

    • 275 posts
    April 26, 2020 12:18 AM EDT
    abuk said:

    The goal is to manage the size of users' image uploads stored in AWS S3.

    Storage can get enormous, and optimizing each image before it's saved could reduce the space required.

     

    Sorry, that's not what the solution I proposed does: it accesses the images, removes all the unnecessary content, resizes them so they are the correct size for where they are being displayed, and caches the result so that the 'smushing' only happens once. However, the original images are still stored exactly as they were uploaded (and modified). It saves bandwidth and speeds up the page, though.

     

     

     

     > Thanks. How does that save storage space, since you'd have the originals and a cached copy? Wouldn't it break the plugins' photo software when referencing images?

     

     

    What you would need is to modify the images either as they are uploaded or after they are uploaded. How are you currently storing the images: CDN, local files, etc.?

     

     

     > Planning to use Cloudflare as the CDN, and S3 for storage.

     

     

    One way of achieving what you want without too much effort or modifying the code would be to modify the files in place, e.g. optimise the JPEG rather than re-encoding it to WebP or something; essentially you would just be reducing the file size. You would run a script that finds all the files, modifies them, and saves the result, then use a cron job to run it every 24 hours or so to apply the fix to new files.

     

     

     

    > Yes, thanks, but I'm trying to avoid that administrative cron chore.

    Though I guess doing that every few months would be awesome too,

    as users' images will become massive.

    I am looking to smush the original image and then save it in S3.

     

     

     

    How many images and how big are you talking?

     

     

    > For a social media site for musicians, it would be a lot. Max size for images, I guess, would be 2 MB; max upload size would be 5 MB.

     

    How many thumbnails are the third-party plugins storing?

     

     

    > This is something I do not know, as it's their stuff, and it is very hard to get proper technical documentation on these plugins.

     

    What steps are you/the plugins taking to reduce the file size as they are being uploaded?

     

     

    > The idea is to use Kraken.io. Currently I do not have a solution.

    I am looking first to move storage to S3, then use a Lambda with Kraken.

     

    Thanks


    This post was edited by playmusician at April 26, 2020 12:21 AM EDT
    • 265 posts
    April 26, 2020 1:14 PM EDT

     Thanks. How does that save storage space, since you'd have the originals and a cached copy? Wouldn't it break the plugins' photo software when referencing images?

    It doesn't save space; it uses more. What it does save is bandwidth between the server and the client, as the images served are only the size needed for them to display. It wouldn't break anything, as it is essentially a 'cached' copy: the rewritten page has its filenames edited so that it refers only to the cached image, and the original image is not interfered with.

     

    I am not au fait with S3 and CDNs with SocialEngine, so I will bow out of any queries regarding that.

     

    I know that the album_photos plugin included with SocialEngine does not store the original uploaded image, only a 700x700 version and (from memory) a 250x250 version for thumbnails. Interestingly, the thumbnails are barely used; e.g. in the feed the 700x700 version is used.