Human Generated Data

Title

Untitled (men in bowling alley, Cloister Inn Club Dinner, Haverford, PA)

Date

1939

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8289

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Person 99.4
Human 99.4
Person 99.3
Person 99.2
Person 98.6
Person 98.1
Person 97.9
Person 97.7
Clothing 97.1
Apparel 97.1
Person 96.6
Floor 94.2
Flooring 92.7
Person 91.4
Person 75.5
Wood 75.2
Female 74.6
Furniture 71.8
Face 69.7
Leisure Activities 67.3
Undershirt 65.3
Overcoat 65
Suit 65
Coat 65
Meal 64.6
Food 64.6
Photo 64.6
Photography 64.6
Indoors 63.3
Dance Pose 61.2
Shorts 59
Pants 57.5
Chair 56.9
Art 56.9
Drawing 56.9
Dance 56.8
Woman 56.2
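
The labels above are the output of Amazon Rekognition's label-detection endpoint, each a name paired with a 0-100 confidence score. A minimal sketch of the call that produces this kind of list, assuming boto3 and AWS credentials are configured; the bucket and object names are hypothetical:

```python
# Minimal sketch: reproduce Rekognition-style image labels.
# Assumes AWS credentials are configured; bucket/key are hypothetical.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "steinmetz-8289.jpg"}},
    MaxLabels=50,
    MinConfidence=55,
)

# Each label carries a name and a 0-100 confidence score,
# which is what entries like "Person 99.4" above record.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```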

Clarifai
created on 2023-10-25

people 99.8
man 97.4
group 96.3
woman 95.6
group together 95
education 94.8
adult 94.3
monochrome 94.1
dancing 93.1
music 93.1
dancer 90.2
indoors 89.6
school 87.1
rehearsal 83.1
child 82.8
musician 81.7
recreation 81
teacher 80.4
actor 79.7
leader 75.3
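
Clarifai's general image-recognition model yields the same concept/score shape. A hedged sketch against Clarifai's v2 REST API; the model slug, API key, and image URL are placeholders, and concept values (0-1) are scaled to the percentage style used above:

```python
# Sketch: query Clarifai's v2 REST API for concept tags.
# API key, model slug, and image URL are placeholders.
import requests

API_KEY = "YOUR_CLARIFAI_API_KEY"  # placeholder
url = "https://api.clarifai.com/v2/models/general-image-recognition/outputs"

payload = {"inputs": [{"data": {"image": {"url": "https://example.org/steinmetz-8289.jpg"}}}]}
headers = {"Authorization": f"Key {API_KEY}", "Content-Type": "application/json"}

resp = requests.post(url, json=payload, headers=headers, timeout=30)
resp.raise_for_status()

# Concept values are 0-1; scale to match the percentage-style list above.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')
```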

Imagga
created on 2022-01-08

people 36.8
teacher 25.5
business 24.9
person 23.7
man 22.8
group 22.6
women 22.1
adult 21.9
men 21.5
male 20.6
urban 20.1
silhouette 19
professional 18.3
city 18.3
crowd 17.3
room 16.5
modern 16.1
musical instrument 16.1
businessman 15.9
chair 15.8
table 15.6
office 15.6
life 15.4
educator 15.4
interior 15
corporate 14.6
sitting 13.7
hall 13.7
airport 13.7
classroom 13
inside 12.9
window 12.8
indoor 12.8
building 12.7
walk 12.4
indoors 12.3
meeting 12.2
glass 11.7
transportation 11.7
performer 11.6
wind instrument 11.3
portrait 11
work 11
black 10.9
lifestyle 10.8
dancer 10.8
travel 10.6
walking 10.4
architecture 10.2
gate 10.1
happy 10
world 10
outfit 9.9
team 9.9
station 9.8
motion 9.4
light 9.4
floor 9.3
suit 9.1
human 9
reflection 8.9
job 8.8
corridor 8.8
entertainer 8.8
couple 8.7
boy 8.7
standing 8.7
scene 8.7
move 8.6
percussion instrument 8.5
marimba 8.4
fashion 8.3
executive 8.3
holding 8.3
board 8.1
activity 8.1
success 8
shop 8
passenger 8
hallway 7.9
subway 7.9
brass 7.9
hands 7.8
mall 7.8
elegance 7.6
stage 7.5
journey 7.5
fun 7.5
leisure 7.5
employee 7.4
transport 7.3
sunset 7.2
to 7.1
day 7.1
happiness 7.1
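
Imagga's tagging endpoint returns comparable tag/confidence pairs on a 0-100 scale. A sketch of the v2 /tags call as documented in Imagga's public API reference; the credentials and image URL are placeholders:

```python
# Sketch: fetch Imagga-style tags via the v2 tagging endpoint.
# API credentials and the image URL are placeholders.
import requests

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/steinmetz-8289.jpg"},
    auth=("YOUR_API_KEY", "YOUR_API_SECRET"),  # HTTP basic auth
    timeout=30,
)
resp.raise_for_status()

# Confidence is already 0-100, matching the list above.
for tag in resp.json()["result"]["tags"]:
    print(f'{tag["tag"]["en"]} {tag["confidence"]:.1f}')
```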

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

text 97.9
wedding dress 97.2
person 95.8
clothing 94.5
bride 94
dress 93.2
woman 92.4
man 88.2
standing 79.6
dance 79.3
wedding 71.6
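
Tags like these come from Azure Computer Vision's image-analysis endpoint. A sketch of the REST call, with a placeholder resource endpoint and key; confidences arrive on a 0-1 scale and are scaled to percentages here:

```python
# Sketch: Azure Computer Vision "analyze" call yielding tags like those above.
# Endpoint and key are placeholders.
import requests

ENDPOINT = "https://example-resource.cognitiveservices.azure.com"  # placeholder
KEY = "YOUR_AZURE_CV_KEY"  # placeholder

resp = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": KEY, "Content-Type": "application/json"},
    json={"url": "https://example.org/steinmetz-8289.jpg"},
    timeout=30,
)
resp.raise_for_status()

# Confidences are 0-1; scale to the percentage style used above.
for tag in resp.json()["tags"]:
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')
```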

Face analysis

AWS Rekognition

Age 29-39
Gender Male, 99.7%
Calm 82.6%
Sad 15.6%
Surprised 0.8%
Confused 0.4%
Angry 0.3%
Disgusted 0.1%
Happy 0.1%
Fear 0.1%

AWS Rekognition

Age 33-41
Gender Male, 97.2%
Sad 72.3%
Happy 19.7%
Calm 3.9%
Fear 1.1%
Disgusted 0.8%
Surprised 0.8%
Confused 0.8%
Angry 0.7%

AWS Rekognition

Age 38-46
Gender Female, 86.5%
Calm 80.2%
Sad 14.1%
Confused 1.6%
Surprised 1.4%
Disgusted 1.2%
Happy 0.6%
Angry 0.6%
Fear 0.2%

AWS Rekognition

Age 40-48
Gender Male, 99.5%
Calm 99.8%
Sad 0.1%
Surprised 0%
Confused 0%
Disgusted 0%
Angry 0%
Happy 0%
Fear 0%

AWS Rekognition

Age 36-44
Gender Male, 61.4%
Calm 98.5%
Sad 1.1%
Fear 0.2%
Angry 0.1%
Disgusted 0.1%
Surprised 0%
Confused 0%
Happy 0%
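
Each block above is one face returned by Rekognition's face-analysis call, with an estimated age range, a gender guess, and a ranked emotion distribution. A sketch of that call with boto3; the bucket and object names are hypothetical:

```python
# Sketch: AWS Rekognition face analysis producing the age/gender/emotion
# estimates above. Bucket and key names are hypothetical.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_faces(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "steinmetz-8289.jpg"}},
    Attributes=["ALL"],  # request age range, gender, emotions, etc.
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    # Emotions come back unordered; sort by confidence, as in the lists above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```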

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
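
The blocks above correspond to Google Cloud Vision face annotations, which report likelihood buckets ("Very unlikely" through "Very likely") rather than numeric scores. A sketch using the google-cloud-vision client, assuming credentials are configured; the image URI is a placeholder:

```python
# Sketch: Google Cloud Vision face detection, which reports the
# likelihood buckets shown above. The image URI is a placeholder.
from google.cloud import vision

client = vision.ImageAnnotatorClient()
image = vision.Image(
    source=vision.ImageSource(image_uri="gs://example-bucket/steinmetz-8289.jpg")
)

response = client.face_detection(image=image)

for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```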

Feature analysis

Amazon

Person 99.4%
Chair 56.9%

Text analysis

Amazon

9538
9538.
MU3
YTERA2
MU3 YTERA2 APPA
APPA

Google

9538
9538.
9538 9538.
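
Both columns of strings above are OCR output over the photograph (hence fragments like "9538" and "MU3 YTERA2 APPA"). A sketch of the two calls that produce them, one via Rekognition and one via Cloud Vision; bucket, key, and URI are placeholders:

```python
# Sketch: the OCR passes behind the "Text analysis" results above.
# Bucket/key and the image URI are placeholders.
import boto3
from google.cloud import vision

# Amazon Rekognition text detection
rekognition = boto3.client("rekognition", region_name="us-east-1")
aws_resp = rekognition.detect_text(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "steinmetz-8289.jpg"}}
)
for detection in aws_resp["TextDetections"]:
    if detection["Type"] == "LINE":  # skip per-word duplicates
        print(detection["DetectedText"])

# Google Cloud Vision text detection
client = vision.ImageAnnotatorClient()
image = vision.Image(
    source=vision.ImageSource(image_uri="gs://example-bucket/steinmetz-8289.jpg")
)
gcp_resp = client.text_detection(image=image)
if gcp_resp.text_annotations:
    print(gcp_resp.text_annotations[0].description)  # full detected text
```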