Artists’ obsession with depicting humans, and even replacing or reimagining them, is not a new trend; it is visible throughout the history of art. With new Artificial Intelligence (AI) art generators like Dall-E and Midjourney there has been a democratisation of art and its availability, and with it a downpour of images depicting humans who are not real but created through text prompts. The depiction and sexual objectification of women in art has long been examined, and this research seeks to understand the same in AI-generated images. Platforms like Instagram provide a space for exhibiting AI-generated images, and content creators use such platforms to express themselves and earn through them. So, are these content creators trained artists, are they art enthusiasts, or is the artist the AI? Are they male, female, or non-binary? Many questions arise about who is creating images that sexually objectify women. An important point to note here is the database of images in the public domain that trains the art generators. This research uses a multimodal method for the collection and analysis of images on platforms like Instagram that depict the sexual objectification of women, in order to better understand the styles, themes, characteristics, and creators that inform this kind of AI-generated art. Juxtaposing these against images of women created by artists on AI art generators reveals an understanding of the intent of the artist versus the creator versus the AI. This research will outline current trends in such imagery and locate them within stereotypes shaped by the male gaze. Its relevance lies in understanding the intent behind these AI art generators and the stereotypes that exist with respect to the objectification of women.