Human Generated Data

Title

Untitled (young men and women seated on grass under trees)

Date

c. 1940

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.4940

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Clothing 99.9
Apparel 99.9
Person 99.4
Human 99.4
Person 99.3
Person 99.3
Person 98.8
Person 98.7
Person 98.5
Robe 98.5
Fashion 98.5
Wedding 98.4
Person 98.1
Gown 97.9
Bride 97.4
Wedding Gown 97.4
Person 95.7
Person 93.8
Person 93.1
Person 92.3
Bridegroom 92.1
Person 90.3
Person 85.2
People 85.1
Person 81.4
Suit 75.8
Overcoat 75.8
Coat 75.8
Dress 73.7
Person 73.6
Face 71.3
Person 69.8
Person 66.7
Bridesmaid 65.4
Crowd 64.2
Person 63.9
Outdoors 61.6
Female 60.6
Photography 60.2
Photo 60.2
Portrait 60
Indoors 56.4
Party 55.4
Person 53.9
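
A minimal sketch of how tag/confidence listings like the one above are typically produced with Amazon Rekognition's DetectLabels API, assuming boto3 is installed, AWS credentials are configured, and "photo.jpg" stands in for the digitized photograph:

# Sketch (assumption): reproducing Rekognition-style label tags.
import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:  # placeholder filename
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=50,  # the listing above bottoms out near 50%
    )

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")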

Clarifai
created on 2023-10-26

people 100
many 98.2
child 97.8
group 97.7
man 97.5
adult 97.4
crowd 95.5
illustration 92
woman 91.4
dancing 91.2
art 89.8
boy 88.2
group together 87.5
music 87.3
war 84.3
wear 83.5
monochrome 79
audience 78.2
engraving 76.4
administration 76.1
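
Concept scores like these are commonly retrieved from Clarifai's general image-recognition model; the sketch below is an assumption based on Clarifai's v2 REST API, with the API key and image URL as placeholders:

# Sketch (assumption): querying Clarifai's general model for concepts.
import requests

API_KEY = "YOUR_CLARIFAI_API_KEY"  # placeholder
url = "https://api.clarifai.com/v2/models/general-image-recognition/outputs"
payload = {"inputs": [{"data": {"image": {"url": "https://example.com/photo.jpg"}}}]}

resp = requests.post(url, json=payload, headers={"Authorization": f"Key {API_KEY}"})

for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")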

Imagga
created on 2022-01-23

negative 42.4
film 35.4
sketch 31.8
drawing 30.3
photographic paper 26.6
grunge 20.4
representation 18.4
photographic equipment 17.8
art 16.3
silhouette 13.2
snow 13
winter 12.8
old 12.5
light 12
ice 12
structure 11.1
design 10.7
dance 10.5
rock 10.4
scene 10.4
style 10.4
black 10.2
people 10
water 10
city 10
carousel 9.9
park 9.8
crowd 9.6
party 9.5
dark 9.2
outdoor 9.2
vintage 9.1
dirty 9
retro 9
life 9
landscape 8.9
cool 8.9
urban 8.7
man 8.7
decoration 8.7
men 8.6
tree 8.5
power 8.4
event 8.3
mountain 8
person 8
ride 8
forest 7.8
architecture 7.8
ancient 7.8
fountain 7.6
elegance 7.6
symbol 7.4
entertainment 7.4
backgrounds 7.3
paint 7.2
history 7.2
sky 7
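
Imagga returns tag/confidence pairs of this shape from its tagging endpoint; a sketch assuming Imagga's v2 /tags API with HTTP Basic auth, where the key/secret and image URL are placeholders:

# Sketch (assumption): fetching Imagga-style tags for an image URL.
import requests

auth = ("YOUR_IMAGGA_KEY", "YOUR_IMAGGA_SECRET")  # placeholders
resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/photo.jpg"},
    auth=auth,
)

for tag in resp.json()["result"]["tags"]:
    print(f"{tag['tag']['en']} {tag['confidence']:.1f}")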

Google
created on 2022-01-23

(no tags returned)

Microsoft
created on 2022-01-23

outdoor 95.9
text 93.9
person 75.5
clothing 67.3
man 64.4
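
Tags of this form can be produced with Microsoft's Computer Vision "analyze" endpoint; a sketch assuming the v3.2 REST API, with the Azure resource endpoint and key as placeholders:

# Sketch (assumption): Azure Computer Vision image tagging.
import requests

endpoint = "https://YOUR-RESOURCE.cognitiveservices.azure.com"  # placeholder
key = "YOUR_AZURE_KEY"  # placeholder

resp = requests.post(
    f"{endpoint}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": key},
    json={"url": "https://example.com/photo.jpg"},
)

for tag in resp.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")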

Color Analysis

(no data)

Face analysis

Amazon

AWS Rekognition

Age 23-31
Gender Male, 97.4%
Happy 50.1%
Calm 20.8%
Fear 14.2%
Angry 5.1%
Disgusted 4.1%
Sad 4%
Confused 1%
Surprised 0.7%

AWS Rekognition

Age 48-56
Gender Male, 51.3%
Happy 84.8%
Calm 10.8%
Fear 1%
Surprised 0.9%
Disgusted 0.8%
Sad 0.7%
Confused 0.5%
Angry 0.4%

AWS Rekognition

Age 19-27
Gender Male, 97%
Happy 34.4%
Confused 32.5%
Calm 9.5%
Angry 7.6%
Disgusted 6.2%
Sad 5%
Surprised 3.5%
Fear 1.4%

AWS Rekognition

Age 26-36
Gender Male, 99%
Sad 48.2%
Calm 16.7%
Happy 12.4%
Disgusted 10.3%
Angry 5.8%
Confused 3.7%
Surprised 1.7%
Fear 1.2%

AWS Rekognition

Age 41-49
Gender Female, 57.4%
Happy 41%
Sad 23.8%
Disgusted 11.5%
Confused 10%
Fear 6.4%
Angry 4%
Calm 2.1%
Surprised 1.1%

AWS Rekognition

Age 28-38
Gender Female, 62.6%
Calm 69.5%
Sad 14.4%
Surprised 7.8%
Happy 2.3%
Confused 2.2%
Disgusted 1.7%
Angry 1.2%
Fear 0.9%

AWS Rekognition

Age 37-45
Gender Male, 80.2%
Happy 96.7%
Sad 2.1%
Calm 0.4%
Confused 0.3%
Surprised 0.3%
Disgusted 0.1%
Fear 0.1%
Angry 0.1%

AWS Rekognition

Age 33-41
Gender Female, 93.5%
Calm 88.8%
Happy 6.4%
Surprised 2.3%
Sad 0.8%
Disgusted 0.6%
Confused 0.5%
Angry 0.3%
Fear 0.3%

AWS Rekognition

Age 36-44
Gender Male, 79.2%
Sad 63%
Fear 9.5%
Angry 7.6%
Calm 6.8%
Disgusted 4.5%
Surprised 4.3%
Happy 2.7%
Confused 1.6%

AWS Rekognition

Age 23-31
Gender Female, 61.3%
Calm 98%
Sad 0.7%
Confused 0.5%
Fear 0.3%
Disgusted 0.2%
Surprised 0.2%
Happy 0.1%
Angry 0.1%
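
The age-range, gender, and per-emotion percentages in the blocks above match the shape of Amazon Rekognition's DetectFaces output; a minimal sketch, assuming boto3 with configured credentials and a placeholder "photo.jpg":

# Sketch (assumption): Rekognition face analysis, one block per face.
import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:  # placeholder filename
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age, gender, and emotion estimates
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")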

Google

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible
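
Google Cloud Vision face detection reports exactly this five-step likelihood scale (VERY_UNLIKELY through VERY_LIKELY) for joy, sorrow, anger, surprise, headwear, and blur; a sketch assuming the google-cloud-vision client library and default credentials:

# Sketch (assumption): Google Cloud Vision face likelihoods.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:  # placeholder filename
    image = vision.Image(content=f.read())

def pretty(likelihood):
    # e.g. VERY_UNLIKELY -> "Very unlikely"
    return vision.Likelihood(likelihood).name.replace("_", " ").capitalize()

for face in client.face_detection(image=image).face_annotations:
    print("Surprise", pretty(face.surprise_likelihood))
    print("Anger", pretty(face.anger_likelihood))
    print("Sorrow", pretty(face.sorrow_likelihood))
    print("Joy", pretty(face.joy_likelihood))
    print("Headwear", pretty(face.headwear_likelihood))
    print("Blurred", pretty(face.blurred_likelihood))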

Feature analysis

Amazon

Person 99.4%
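
Rows like "Person 99.4%" correspond to label instances (per-object bounding boxes) in a Rekognition DetectLabels response, which is also the likely source of the many repeated "Person" entries in the tag list above; a sketch under the same boto3 assumptions as before:

# Sketch (assumption): extracting per-instance detections from DetectLabels.
import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:  # placeholder filename
    response = client.detect_labels(Image={"Bytes": f.read()})

for label in response["Labels"]:
    for instance in label.get("Instances", []):  # only some labels have boxes
        print(f"{label['Name']} {instance['Confidence']:.1f}%")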

Categories

Imagga

paintings art 100%
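
"paintings art" is one of the fixed categories in Imagga's personal_photos categorizer; the sketch below assumes that categorizer's endpoint and response shape, with placeholder credentials:

# Sketch (assumption): Imagga category classification.
import requests

auth = ("YOUR_IMAGGA_KEY", "YOUR_IMAGGA_SECRET")  # placeholders
resp = requests.get(
    "https://api.imagga.com/v2/categories/personal_photos",
    params={"image_url": "https://example.com/photo.jpg"},
    auth=auth,
)

for category in resp.json()["result"]["categories"]:
    print(f"{category['name']['en']} {category['confidence']:.0f}%")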

Text analysis

Amazon

12634
13634
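
Strings like these are consistent with OCR of the negative's edge numbering; Amazon Rekognition's DetectText API returns detected lines of this kind. A minimal sketch, assuming boto3 and a placeholder "photo.jpg":

# Sketch (assumption): Rekognition text detection.
import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:  # placeholder filename
    response = client.detect_text(Image={"Bytes": f.read()})

for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":  # skip word-level duplicates
        print(detection["DetectedText"])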

Google

13634 T2634.
13634
T2634.