Human Generated Data

Title

Experimental photograph testing developer

Date

-

People

-

Classification

Photographs

Credit Line

Harvard Art Museums/Straus Center for Conservation and Technical Studies, Gift of John Erdman and Gary Schneider, 2017.97.5.1

Machine Generated Data

Tags

Amazon
created on 2019-04-09

Plant 92.7
Jar 92.2
Pottery 92.2
Vase 92.2
Human 90.9
Blossom 83.6
Flower 83.6
Person 83.4
Finger 75.3
Potted Plant 71.5
Flower Arrangement 69.3
Apparel 58.7
Clothing 58.7

Clarifai
created on 2018-02-19

people 98.3
adult 96.5
one 95.8
man 95.2
wear 93
woman 92.6
portrait 91.8
monochrome 84.7
furniture 84.1
indoors 83.5
business 81.3
fashion 80.4
girl 79.5
facial expression 79.4
paper 79.4
flower 79.1
child 77.8
retro 77.2
room 76
two 75.7

Imagga
created on 2018-02-19

bag 44.2
black 43.9
man 26.2
container 25
mailbag 22.8
sweatshirt 22.6
model 20.2
clothing 20.2
hand 19.7
male 18.4
person 18.4
garment 18.3
pullover 18.1
people 17.8
body 17.6
adult 16.8
businessman 16.8
business 14.6
fashion 14.3
portrait 14.2
sweater 13.7
studio 13.7
human 13.5
sexy 12.8
suit 12.4
hands 12.2
face 12.1
covering 12
hair 11.9
dark 11.7
holding 11.5
backpack 11.2
attractive 11.2
women 11.1
hat 10.9
close 10.8
style 10.4
love 10.3
back 10.1
skin 9.3
dress 9
lady 8.9
worker 8.9
standing 8.7
elegant 8.6
tie 8.5
pretty 8.4
office 8
posing 8
job 8
work 7.8
erotic 7.7
mask 7.7
formal 7.6
casual 7.6
elegance 7.6
sensual 7.3
sensuality 7.3
success 7.2
looking 7.2

Google
created on 2018-02-19

Microsoft
created on 2018-02-19

indoor 95
person 90.6

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 26-43
Gender Male, 96.8%
Surprised 0.7%
Disgusted 88.3%
Confused 1.7%
Happy 0.5%
Calm 1.5%
Sad 0.7%
Angry 6.6%

Feature analysis

Amazon

Person 83.4%

Captions

Microsoft
created on 2018-02-19

a man sitting on a table 35.2%
a man sitting at a table 35.1%
a person sitting on a table 35%

Azure OpenAI

created on 2024-11-18

You are looking at a black and white photograph that features a person with their eyes closed, appearing to be resting or perhaps posing as if asleep. The person is lying on their side on a flat surface, with their head resting on an outstretched arm. In the foreground, there is a vase with a floral arrangement on the left, and at the bottom right, you can see some items arranged next to the person's other hand: a wristwatch, a pen, and a piece of paper with what looks like a drawing or diagram. The backdrop and the surface the person is lying on are covered with a dark cloth, which provides a stark contrast to the white vase and the lighter-toned objects. The image conveys a serene and somewhat artistic composition, combining elements which could symbolize time (the wristwatch), creativity or work (the pen and paper), and life or growth (the flowers). The choice of black and white enhances the textural details and lends the image a timeless quality.

Anthropic Claude

created on 2024-11-18

The image shows a black and white photograph depicting a person reclining on a couch with a vase of flowers beside them. The person's face is visible, and they appear to be sleeping or resting with their eyes closed. The photograph has a somber, contemplative mood, with the contrast between the stark black and white tones and the delicate flowers creating a sense of solemnity and introspection.