Human Generated Data

Title

Untitled (wedding guests seated on lawn)

Date

1953

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8744

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Furniture 99.9
Person 99.6
Human 99.6
Grass 98.9
Plant 98.9
Person 98.8
Person 98.6
Person 98.4
Person 98.3
Person 96.4
Outdoors 94.4
Person 93.8
Person 91.6
Park 87.2
Lawn 87.2
People 83.8
Nature 81.2
Shorts 80.2
Clothing 80.2
Apparel 80.2
Chair 77.4
Chair 74.2
Tree 70.4
Female 70.4
Crowd 69.6
Suit 69.2
Coat 69.2
Overcoat 69.2
Kid 65.6
Child 65.6
Girl 64.3
Field 64.2
Photography 60.8
Photo 60.8
Text 60.6
Yard 59.3
Play 58.2
Path 56.8
Sand 55
Person 53.1
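
The label/confidence pairs above match the shape of Amazon Rekognition's DetectLabels response. A minimal boto3 sketch of how such tags could be produced is below; the image file name and the MinConfidence threshold are placeholders, not details of the museum's actual pipeline.

```python
import boto3

# Hypothetical local copy of the photograph; the real ingest pipeline is not documented here.
IMAGE_PATH = "steinmetz_4.2002.8744.jpg"

rekognition = boto3.client("rekognition")

with open(IMAGE_PATH, "rb") as f:
    image_bytes = f.read()

# DetectLabels returns label names with confidence scores (0-100),
# which is the format of the "Amazon" tag list above.
response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=50,
)

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```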

Clarifai
created on 2023-10-25

people 99.9
group 98.6
adult 97.2
many 96.5
group together 95.8
man 94.9
canine 93.9
child 93.6
administration 89.5
dog 87.9
military 86.7
wear 86.2
leader 85.8
campsite 84.6
education 83.3
boy 82.9
woman 82.8
cavalry 82.7
war 80.5
school 80.3
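
The Clarifai concepts above (name plus score) resemble output from Clarifai's general image-recognition model. A rough sketch against Clarifai's v2 REST API follows; the API key, model ID, and image URL are placeholders, and the exact endpoint and auth details may differ for newer Clarifai accounts.

```python
import requests

# Placeholder credentials and identifiers (not real values).
API_KEY = "YOUR_CLARIFAI_API_KEY"
MODEL_ID = "general-image-recognition"
IMAGE_URL = "https://example.org/steinmetz_4.2002.8744.jpg"

response = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
)
response.raise_for_status()

# Concepts come back as name/value pairs, e.g. "people 99.9".
for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")
```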

Imagga
created on 2022-01-09

sport 42.9
people 22.3
beach 22
travel 21.8
sky 21.7
water 20
vacation 18.8
landscape 18.6
outdoors 17.4
summer 17.4
sand 17.3
ocean 16.8
silhouette 16.6
sunset 16.2
man 16.1
walking 15.2
sea 14.9
child 14.6
person 13.7
mountain 13.5
couple 13.1
lifestyle 13
tourism 12.4
active 11.9
stone 11.8
world 11.8
coast 11.7
outdoor 11.5
old 11.1
swimming trunks 11
male 10.7
athlete 10.6
hill 10.3
runner 10.1
field 10
family 9.8
group 9.7
sun 9.7
tourist 9.7
rock 9.6
walk 9.5
spectator 9.5
swimsuit 9.3
clouds 9.3
waves 9.3
leisure 9.1
sprinkler 8.8
scenic 8.8
grass 8.7
hiking 8.7
holiday 8.6
architecture 8.6
relax 8.4
happy 8.1
river 8
together 7.9
run 7.7
running 7.7
dusk 7.6
fun 7.5
desert 7.5
action 7.4
mechanical device 7.4
lake 7.3
road 7.2
scenery 7.2
adult 7.2
recreation 7.2
history 7.2
sunlight 7.1
season 7
garment 7
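
The Imagga tags above follow the same name-plus-confidence pattern. A small sketch against Imagga's v2 tagging endpoint is below; the key, secret, and image URL are placeholders, and the response shape is assumed from Imagga's documented format.

```python
import requests

# Placeholder credentials and image location.
API_KEY = "YOUR_IMAGGA_API_KEY"
API_SECRET = "YOUR_IMAGGA_API_SECRET"
IMAGE_URL = "https://example.org/steinmetz_4.2002.8744.jpg"

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
)
response.raise_for_status()

# Each entry carries a confidence score and a localized tag name.
for entry in response.json()["result"]["tags"]:
    print(f"{entry['tag']['en']} {entry['confidence']:.1f}")
```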

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

outdoor 96.3
text 95.7
person 93.6
clothing 82.6
man 82.5
people 56
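
The Microsoft tags above look like output from the Azure Computer Vision tagging operation. A brief sketch using the azure-cognitiveservices-vision-computervision SDK follows; the endpoint, key, and image URL are placeholders.

```python
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Placeholder endpoint, key, and image location.
ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com/"
KEY = "YOUR_AZURE_KEY"
IMAGE_URL = "https://example.org/steinmetz_4.2002.8744.jpg"

client = ComputerVisionClient(ENDPOINT, CognitiveServicesCredentials(KEY))

# tag_image returns tag names with confidences in [0, 1].
result = client.tag_image(IMAGE_URL)
for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")
```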

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 48-54
Gender Female, 94.1%
Happy 58.5%
Calm 40%
Sad 0.7%
Confused 0.3%
Fear 0.2%
Surprised 0.1%
Angry 0.1%
Disgusted 0.1%

AWS Rekognition

Age 45-51
Gender Female, 96.6%
Happy 77.1%
Calm 13.4%
Sad 3.8%
Confused 1.5%
Surprised 1.3%
Fear 1.2%
Disgusted 0.8%
Angry 0.8%

AWS Rekognition

Age 33-41
Gender Female, 55.6%
Calm 99.8%
Sad 0%
Confused 0%
Happy 0%
Disgusted 0%
Angry 0%
Fear 0%
Surprised 0%

AWS Rekognition

Age 16-24
Gender Female, 64.7%
Calm 86.9%
Sad 9.1%
Happy 1.2%
Confused 0.9%
Fear 0.7%
Disgusted 0.5%
Surprised 0.3%
Angry 0.3%

AWS Rekognition

Age 34-42
Gender Male, 97.5%
Calm 77.4%
Happy 10.9%
Sad 7.2%
Confused 2.5%
Disgusted 0.6%
Surprised 0.6%
Fear 0.4%
Angry 0.3%

AWS Rekognition

Age 38-46
Gender Male, 99.4%
Sad 81%
Confused 17.2%
Calm 0.5%
Disgusted 0.4%
Angry 0.3%
Happy 0.3%
Surprised 0.2%
Fear 0.2%
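
The age ranges, gender estimates, and emotion percentages above correspond to the fields Amazon Rekognition's DetectFaces returns when all attributes are requested. A minimal boto3 sketch is below; the image path is a placeholder.

```python
import boto3

rekognition = boto3.client("rekognition")

# Placeholder local copy of the photograph.
with open("steinmetz_4.2002.8744.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # includes AgeRange, Gender, and Emotions
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    # Emotions are returned for every detected face; sort by confidence for display.
    emotions = sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True)
    print(f"Age {age['Low']}-{age['High']}, "
          f"Gender {gender['Value']} {gender['Confidence']:.1f}%")
    for emotion in emotions:
        print(f"  {emotion['Type'].title()} {emotion['Confidence']:.1f}%")
```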

Google

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
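
The Google Vision entries above report likelihood buckets (Very unlikely through Very likely) for joy, sorrow, anger, surprise, headwear, and blur. A short google-cloud-vision sketch that prints the same fields is below; the file path is a placeholder.

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

# Placeholder local copy of the photograph.
with open("steinmetz_4.2002.8744.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Likelihoods are enum buckets such as VERY_UNLIKELY, UNLIKELY, POSSIBLE, LIKELY.
for face in response.face_annotations:
    print("Joy:", vision.Likelihood(face.joy_likelihood).name,
          "| Sorrow:", vision.Likelihood(face.sorrow_likelihood).name,
          "| Anger:", vision.Likelihood(face.anger_likelihood).name,
          "| Surprise:", vision.Likelihood(face.surprise_likelihood).name,
          "| Headwear:", vision.Likelihood(face.headwear_likelihood).name,
          "| Blurred:", vision.Likelihood(face.blurred_likelihood).name)
```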

Feature analysis

Amazon

Person 99.6%
Chair 77.4%
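
The feature-analysis entries (Person and Chair, each with a confidence) most likely come from labels for which Rekognition's DetectLabels also returns localized instances with bounding boxes. A hedged sketch of reading those instances follows; the image path is again a placeholder.

```python
import boto3

rekognition = boto3.client("rekognition")

# Placeholder local copy of the photograph.
with open("steinmetz_4.2002.8744.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(Image={"Bytes": image_bytes}, MinConfidence=50)

# Labels such as Person and Chair carry per-instance bounding boxes
# (coordinates are fractions of the image width and height).
for label in response["Labels"]:
    for instance in label.get("Instances", []):
        box = instance["BoundingBox"]
        print(f"{label['Name']} {instance['Confidence']:.1f}% "
              f"left={box['Left']:.2f} top={box['Top']:.2f} "
              f"w={box['Width']:.2f} h={box['Height']:.2f}")
```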

Text analysis

Amazon

38607
BAD
KODA
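
The Amazon text results ("38607", "BAD", "KODA") are the kind of short fragments Rekognition's DetectText extracts from edge printing and signage. A minimal boto3 sketch is below; the image path is a placeholder.

```python
import boto3

rekognition = boto3.client("rekognition")

# Placeholder local copy of the photograph.
with open("steinmetz_4.2002.8744.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

# LINE-level detections roughly match the short strings listed above.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(f"{detection['DetectedText']} ({detection['Confidence']:.1f}%)")
```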

Google

LO98 E
LO98
E
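
The Google results ("LO98 E", "LO98", "E") follow the pattern of Cloud Vision text detection, which returns the full detected string first and the individual tokens after it. A short sketch is below; the file path is a placeholder.

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

# Placeholder local copy of the photograph.
with open("steinmetz_4.2002.8744.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)

# The first annotation is the full detected text ("LO98 E");
# the remaining annotations are the individual tokens ("LO98", "E").
for annotation in response.text_annotations:
    print(annotation.description)
```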