Human Generated Data

Title

Untitled (circus performers standing in front of train)

Date

c. 1940

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.7633

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Clothing 100
Apparel 100
Human 99.5
Person 99.5
Coat 98.2
Overcoat 98.2
Person 98
Person 96.1
Suit 86
Hat 68.7
Shorts 65.4
Sun Hat 58.4
Tire 55.1

Clarifai
created on 2023-10-25

people 99.9
woman 97.5
adult 97.4
wear 96.6
veil 95.5
group 95.2
man 94.6
lid 94.6
child 93.5
outerwear 93.1
street 92.3
monochrome 92
two 91.6
coat 91.2
three 87.4
group together 87.1
portrait 84.5
outfit 83.9
art 83.3
administration 83.2

Imagga
created on 2022-01-08

crutch 92
staff 72.1
stick 53.6
people 26.8
old 18.1
man 17.5
city 15.8
adult 15
person 14.4
travel 14.1
dress 12.6
walking 12.3
male 12.1
men 11.2
women 11.1
art 11.1
religion 10.8
world 10.6
life 10.2
black 10.2
dark 10
outdoor 9.9
statue 9.7
urban 9.6
scene 9.5
silhouette 9.1
kin 8.9
group 8.9
mother 8.8
holiday 8.6
culture 8.5
religious 8.4
window 8.3
street 8.3
tourism 8.2
human 8.2
park 8.2
business 7.9
couple 7.8
standing 7.8
horror 7.8
sculpture 7.6
two 7.6
monument 7.5
tradition 7.4
tourist 7.3
history 7.2
portrait 7.1
family 7.1
love 7.1
happiness 7

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

black and white 97.6
clothing 96.9
person 95.3
text 95.1
outdoor 88
monochrome 87.6
street 84.2
man 68.6
statue 54.3

Color Analysis

Face analysis

AWS Rekognition

Age 31-41
Gender Male, 61.5%
Sad 50.6%
Calm 24.6%
Surprised 11.1%
Fear 5.2%
Happy 3.7%
Angry 2.6%
Confused 1.2%
Disgusted 1%

AWS Rekognition

Age 31-41
Gender Female, 84.9%
Calm 59.7%
Fear 27.3%
Sad 5.5%
Surprised 2.9%
Happy 2.7%
Disgusted 1%
Angry 0.5%
Confused 0.4%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.5%

Categories

Text analysis

Amazon

2