Human Generated Data

Title

Untitled (Baldwin alumni in costumes standing around a rug)

Date

1937

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8453

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-15

Person 99.5
Human 99.5
Person 99.1
Person 99.1
Clothing 98.3
Apparel 98.3
Person 98.2
Person 98.1
Person 98
Person 97.2
Person 94.6
Person 88.1
Person 80.2
Costume 74.5
Crowd 69.6
People 66.2
Face 66.1
Female 65.6
Advertisement 64.6
Airplane 61.8
Transportation 61.8
Vehicle 61.8
Aircraft 61.8
Text 61.2
Photography 60.4
Photo 60.4
Outdoors 58.4
Coat 57.6
Poster 55.8
Person 55.5
Performer 55.2

Clarifai
created on 2023-10-26

people 99.9
group 97.8
music 97.5
wear 96.4
man 96
adult 94.6
woman 92.4
theater 92.3
leader 92
musician 91.1
group together 90.4
outfit 90.3
singer 89.2
child 88.8
administration 88.7
actress 86.5
several 86.5
actor 84.8
five 83.7
many 81.7

Imagga
created on 2022-01-15

people 25.1
man 23.5
person 22.4
male 22
adult 17.7
stage 16.3
men 16.3
portrait 15.5
dark 13.4
fashion 12.1
love 11.8
outdoor 11.5
outdoors 11.2
groom 11
platform 10.9
sport 10.9
silhouette 10.8
bride 10.5
fun 10.5
couple 10.4
old 10.4
women 10.3
city 10
dress 9.9
active 9.7
group 9.7
black 9.6
happy 9.4
dance 9.3
action 9.3
human 9
sky 8.9
happiness 8.6
model 8.5
two 8.5
world 8.3
wedding 8.3
symbol 8.1
team 8.1
clothing 7.8
player 7.7
outside 7.7
star 7.6
vacation 7.4
water 7.3
freedom 7.3
business 7.3
sexy 7.2
lifestyle 7.2
work 7.2
building 7.1
hair 7.1
face 7.1
statue 7

Google
created on 2022-01-15

Microsoft
created on 2022-01-15

text 99.6
clothing 95.2
person 91.5
man 84.3
posing 78
black and white 76.9
poster 64
art 55.3
old 46
clothes 25.2

Face analysis

AWS Rekognition

Age 33-41
Gender Male, 97.6%
Calm 84%
Sad 9.7%
Surprised 3.1%
Confused 1.2%
Angry 0.6%
Disgusted 0.6%
Fear 0.5%
Happy 0.3%

AWS Rekognition

Age 40-48
Gender Male, 91.3%
Calm 64.8%
Happy 21.3%
Disgusted 3.9%
Confused 2.6%
Fear 2.6%
Surprised 1.8%
Sad 1.7%
Angry 1.2%

AWS Rekognition

Age 47-53
Gender Male, 99.9%
Calm 99.3%
Sad 0.2%
Surprised 0.2%
Confused 0.1%
Angry 0.1%
Happy 0.1%
Disgusted 0%
Fear 0%

AWS Rekognition

Age 24-34
Gender Male, 98.1%
Happy 47.5%
Calm 40%
Surprised 9.8%
Disgusted 1.1%
Confused 0.6%
Sad 0.5%
Angry 0.3%
Fear 0.2%

AWS Rekognition

Age 18-24
Gender Male, 82%
Sad 97.3%
Calm 2.5%
Fear 0.1%
Surprised 0%
Angry 0%
Disgusted 0%
Confused 0%
Happy 0%

AWS Rekognition

Age 27-37
Gender Male, 98.5%
Sad 84.5%
Calm 6.4%
Angry 5.9%
Confused 1.9%
Happy 0.5%
Surprised 0.5%
Disgusted 0.2%
Fear 0.2%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.5%
Airplane 61.8%
Poster 55.8%

Text analysis

Amazon

100E
-100E
830N3730
EEI

Google

100€-
100€-