Human Generated Data

Title

Untitled (Baldwin students walking in a line around long table)

Date

1937

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8452

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-15

Person 99.4
Human 99.4
Person 98.8
Person 98.7
Person 97
Clothing 96.9
Apparel 96.9
Person 93.5
Crowd 88.1
Person 88.1
Face 86.3
Female 77
People 73.2
Person 71.8
Audience 67.9
Woman 63.7
Furniture 62.9
Person 62.7
Photography 62.6
Photo 62.6
Portrait 62.6
Indoors 62.4
Sitting 62.3
Overcoat 62.3
Suit 62.3
Coat 62.3
Text 58.5
Finger 55.6
Dress 55.3
Couch 55.2
Person 52.7

Clarifai
created on 2023-10-26

people 99.8
music 98.7
woman 97.8
group 97.4
adult 96.6
many 96.4
dancing 95.9
man 95.8
group together 94.2
dancer 90.9
musician 90.3
wear 89.4
administration 87.9
theater 87.2
audience 86
child 83.5
jazz 82.9
education 82.9
singer 82.8
instrument 82.1

Imagga
created on 2022-01-15

people 22.3
musical instrument 18.8
person 18.4
silhouette 16.5
brass 16.5
bride 16.4
dress 16.2
adult 15.9
black 15.2
wind instrument 14.7
male 14.2
man 14.1
couple 13.9
love 12.6
group 12.1
wedding 11.9
clothing 11.7
model 11.7
men 11.2
dark 10.8
groom 10.7
happy 10.6
flowers 10.4
dance 10.1
fashion 9.8
portrait 9.7
bouquet 9.7
party 9.4
happiness 9.4
lifestyle 9.4
dancer 9.3
stage 9.2
pretty 9.1
attractive 9.1
art 8.8
celebration 8.8
elegance 8.4
human 8.2
girls 8.2
women 7.9
boy 7.8
summer 7.7
married 7.7
grunge 7.7
life 7.7
sky 7.6
outdoor 7.6
hand 7.6
bathing cap 7.6
sport 7.5
leisure 7.5
evening 7.5
performer 7.4
style 7.4
event 7.4
sexy 7.2
sunset 7.2
spectator 7.2
activity 7.2
night 7.1
together 7

Google
created on 2022-01-15

Black 89.6
Chair 87.8
Coat 85.6
Black-and-white 83.9
Suit 79.1
Art 77.6
Monochrome 77
Monochrome photography 76.7
Vintage clothing 76.5
Font 74.5
Event 72.1
Room 71.2
Table 67.9
Photo caption 65.5
Stock photography 65.2
History 63.5
Window 62.9
Classic 60.1
Team 59.9
Illustration 57.5

Microsoft
created on 2022-01-15

text 99.4
person 96.6
clothing 83.7
black and white 77.7
concert 74.5
crowd 0.6

Face analysis

AWS Rekognition

Age 42-50
Gender Male, 99.1%
Sad 65.2%
Calm 19.1%
Surprised 4%
Confused 3.1%
Happy 2.9%
Angry 2.7%
Fear 1.8%
Disgusted 1.3%

AWS Rekognition

Age 33-41
Gender Male, 98.5%
Calm 99.9%
Happy 0%
Confused 0%
Disgusted 0%
Sad 0%
Surprised 0%
Angry 0%
Fear 0%

AWS Rekognition

Age 38-46
Gender Male, 100%
Calm 92.6%
Surprised 5.6%
Angry 0.6%
Sad 0.4%
Disgusted 0.3%
Happy 0.3%
Fear 0.2%
Confused 0.1%

AWS Rekognition

Age 34-42
Gender Male, 92.9%
Sad 69.8%
Calm 18.1%
Happy 4.8%
Disgusted 2%
Angry 1.9%
Confused 1.9%
Fear 0.8%
Surprised 0.8%

AWS Rekognition

Age 23-33
Gender Male, 98.9%
Calm 84.7%
Disgusted 3.9%
Angry 3.1%
Surprised 2%
Confused 1.9%
Happy 1.7%
Sad 1.6%
Fear 1.1%

AWS Rekognition

Age 35-43
Gender Male, 89.1%
Calm 86%
Happy 7.2%
Disgusted 2.2%
Surprised 1.6%
Fear 1.1%
Confused 0.9%
Sad 0.6%
Angry 0.4%

AWS Rekognition

Age 20-28
Gender Male, 77.3%
Sad 95.8%
Calm 1.9%
Happy 1.1%
Fear 0.4%
Confused 0.3%
Disgusted 0.2%
Angry 0.2%
Surprised 0.2%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Feature analysis

Amazon

Person 99.4%

Text analysis

Amazon

830N3330
YT33AS 830N3330
YT33AS
wood

Google

YT3RA2
A303330
300t YT3RA2 A303330
300t