Human Generated Data

Title

Untitled (Mask & Wig performers on stage)

Date

1938

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.7465

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Person 99.6
Human 99.6
Person 99.6
Person 98.1
Person 98
Person 98
Person 98
Person 97.8
Person 96.1
Person 94.4
Person 93.4
Clothing 86.6
Apparel 86.6
People 79.6
Figurine 68.2
Text 58.9
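
The Amazon tags above have the shape of Amazon Rekognition label-detection output (label name plus confidence score). A minimal sketch of how comparable labels could be retrieved with boto3 follows; this is not the museum's actual pipeline, and the region, bucket, and object key are hypothetical placeholders.

# Sketch: image labels via Amazon Rekognition (boto3).
# The S3 bucket and object key are hypothetical placeholders.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "example-image-bucket", "Name": "steinmetz-7465.jpg"}},
    MaxLabels=20,
    MinConfidence=50,
)

# Print each label and its confidence, matching the list layout above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")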

Clarifai
created on 2023-10-25

people 99.5
man 96.9
group 96.5
group together 96.2
woman 95.5
adult 95.1
many 93.6
indoors 91.2
illustration 89.6
athlete 88.3
squad 86.8
young 86.5
child 83.1
uniform 83
competition 81.9
adolescent 80.8
crowd 80.3
wear 80
desktop 79
education 77.4
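
The Clarifai concepts above resemble output from Clarifai's general image-recognition model. A sketch using the v2 REST API is below; the endpoint path and model name follow Clarifai's public documentation as best recalled and should be treated as assumptions, and the key and image URL are placeholders.

# Sketch: concept tagging via Clarifai's v2 REST API (endpoint and model
# name are assumptions; the key and image URL are placeholders).
import requests

CLARIFAI_KEY = "YOUR_API_KEY"  # placeholder
url = "https://api.clarifai.com/v2/models/general-image-recognition/outputs"

payload = {"inputs": [{"data": {"image": {"url": "https://example.com/steinmetz-7465.jpg"}}}]}
headers = {"Authorization": f"Key {CLARIFAI_KEY}"}

response = requests.post(url, json=payload, headers=headers).json()

# Each concept carries a name and a 0-1 value.
for concept in response["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")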

Imagga
created on 2022-01-08

case 48.1
golfer 21.9
player 20.1
people 15.6
person 15.4
art 14.2
party 13.7
contestant 13.6
billboard 12.8
man 12.7
symbol 12.1
graphic 11.7
business 11.5
hand 11.4
male 11.3
design 11.2
men 11.1
black 10.9
night 10.6
signboard 9.6
celebration 9.6
sky 9.6
grunge 9.4
silhouette 9.1
sign 9
human 9
decoration 8.9
water 8.7
adult 8.5
structure 8.4
old 8.3
light 8
women 7.9
holiday 7.9
pretty 7.7
blackboard 7.5
landscape 7.4
drawing 7.4
reflection 7.2
dance 7.1
businessman 7.1
antique 7
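
The Imagga tags above look like output from Imagga's tagging endpoint. The sketch below assumes Imagga's v2 /tags REST endpoint with basic authentication; the endpoint, response shape, credentials, and image URL are all assumptions or placeholders.

# Sketch: tagging via Imagga's v2 REST API (endpoint and response shape
# are assumptions; credentials and the image URL are placeholders).
import requests

api_key, api_secret = "YOUR_KEY", "YOUR_SECRET"  # placeholders

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/steinmetz-7465.jpg"},
    auth=(api_key, api_secret),
).json()

for item in response["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")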

Google
created on 2022-01-08

Font 82.6
Art 80.7
Facade 73.6
Statue 70.4
Monochrome photography 68.4
Rectangle 67.8
Monochrome 67.8
Fictional character 66.7
History 64.9
Painting 64.1
Visual arts 62.9
Illustration 61.8
Metal 60.7
Sculpture 59.3
Collection 57
Room 56.6
Relief 54.1
Animation 51.6
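
The Google entries above match the label annotations returned by the Cloud Vision API. A minimal sketch with a recent google-cloud-vision client follows; the local file path is a hypothetical placeholder.

# Sketch: label detection with the Google Cloud Vision client library.
# The local file path is a hypothetical placeholder.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("steinmetz-7465.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.label_detection(image=image)

# Each annotation has a description and a 0-1 score.
for annotation in response.label_annotations:
    print(f"{annotation.description} {annotation.score * 100:.1f}")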

Microsoft
created on 2022-01-08

text 98.9
drawing 90.9
person 69.7
cartoon 69.5
clothing 67.7
sketch 63.2
black and white 54.9
old 50.5
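
The Microsoft tags above are consistent with the Azure Computer Vision tagging operation. A sketch with the azure-cognitiveservices-vision-computervision SDK follows; the endpoint, key, and image URL are hypothetical placeholders.

# Sketch: image tagging with Azure Computer Vision. Endpoint, key, and
# image URL are hypothetical placeholders.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://example-resource.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("YOUR_KEY"),
)

result = client.tag_image("https://example.com/steinmetz-7465.jpg")

# Tags come back with a name and a 0-1 confidence.
for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")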

Color Analysis

Face analysis

AWS Rekognition

Age 29-39
Gender Female, 65.9%
Calm 68.5%
Happy 11.9%
Confused 4.8%
Disgusted 4.4%
Sad 3.4%
Surprised 3.1%
Angry 2.8%
Fear 1.1%

AWS Rekognition

Age 38-46
Gender Male, 66.6%
Calm 93%
Happy 2.6%
Confused 1.3%
Sad 1.1%
Disgusted 0.9%
Angry 0.5%
Surprised 0.4%
Fear 0.2%
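
The two AWS Rekognition blocks above (age range, gender, and per-emotion scores) correspond to face detection with all facial attributes requested. A minimal boto3 sketch follows; the file name is a hypothetical placeholder.

# Sketch: face attributes (age range, gender, emotions) with Rekognition.
# The image file name is a hypothetical placeholder.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("steinmetz-7465.jpg", "rb") as f:
    response = rekognition.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    # Emotions are returned with per-emotion confidence scores.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")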

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
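
The eight Google Vision blocks above each rate one detected face on the API's likelihood scale (Very unlikely through Very likely). A sketch of the corresponding face-detection call follows; the file path is a hypothetical placeholder.

# Sketch: per-face likelihood ratings with the Google Cloud Vision API.
# The file path is a hypothetical placeholder.
from google.cloud import vision

likelihood = vision.Likelihood  # VERY_UNLIKELY ... VERY_LIKELY

client = vision.ImageAnnotatorClient()

with open("steinmetz-7465.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    print("Surprise", likelihood(face.surprise_likelihood).name)
    print("Anger", likelihood(face.anger_likelihood).name)
    print("Sorrow", likelihood(face.sorrow_likelihood).name)
    print("Joy", likelihood(face.joy_likelihood).name)
    print("Headwear", likelihood(face.headwear_likelihood).name)
    print("Blurred", likelihood(face.blurred_likelihood).name)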

Feature analysis

Amazon

Person 99.6%

Categories

Text analysis

Amazon

8613
MJI7
MJI7 YEET A70A
A70A
YEET
LA
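
The Amazon strings above resemble Rekognition text-detection output, which returns both full lines and the individual words within them. A minimal boto3 sketch follows; the file name is a hypothetical placeholder.

# Sketch: OCR-style text detection with Amazon Rekognition.
# The image file name is a hypothetical placeholder.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("steinmetz-7465.jpg", "rb") as f:
    response = rekognition.detect_text(Image={"Bytes": f.read()})

# Rekognition returns LINE and WORD detections; print each with its type.
for detection in response["TextDetections"]:
    print(detection["Type"], detection["DetectedText"])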

Google

8613 8613 MJI3 YT3RA 2 A7ƏA
8613
MJI3
YT3RA
2
A7ƏA
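
The Google list above follows the shape of Cloud Vision text detection, where the first annotation is the full detected string and the remaining annotations are the individual tokens. A sketch follows; the file path is a hypothetical placeholder.

# Sketch: text detection with the Google Cloud Vision client library.
# The local file path is a hypothetical placeholder.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("steinmetz-7465.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)

# The first annotation is the full text; the rest are individual words.
for annotation in response.text_annotations:
    print(annotation.description)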