Human Generated Data

Title

Untitled (men working on airplane engine, Olmstead Airfield, Pennsylvania)

Date

1942

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.11783

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-15

Person 99.6
Human 99.6
Person 99.6
Clothing 89.3
Apparel 89.3
Collage 85
Advertisement 85
Poster 85
Art 79.7
Airplane 73.9
Transportation 73.9
Vehicle 73.9
Aircraft 73.9
Suit 59.5
Coat 59.5
Overcoat 59.5
Drawing 58.8
Sculpture 56.6
Female 55.1
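
The Amazon tags above are the kind of label list returned by AWS Rekognition's label-detection API, with each tag followed by its confidence score. A minimal sketch of how such labels could be retrieved is shown below; the S3 bucket and object name are hypothetical placeholders, not part of this record, and the actual pipeline used to generate these tags is not documented here.

    import boto3

    # Hypothetical S3 location of the digitized photograph (placeholder names).
    rekognition = boto3.client("rekognition", region_name="us-east-1")

    response = rekognition.detect_labels(
        Image={"S3Object": {"Bucket": "museum-images", "Name": "4.2002.11783.jpg"}},
        MaxLabels=20,
        MinConfidence=50,
    )

    # Print label names with confidence, matching the "Person 99.6" style above.
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")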

Clarifai
created on 2023-10-25

people 99.8
group 98.4
wear 97
adult 96.8
man 96.7
military 96.6
vehicle 96
interaction 94.7
group together 94.3
science 93
one 93
two 91.3
outfit 91.2
many 90.3
several 89.7
illustration 89.5
leader 89.4
uniform 87.7
war 86.3
education 86.2

Imagga
created on 2022-01-15

man 20.1
person 18.3
male 17
black 15.6
astronaut 14.8
art 14.7
grunge 13.6
music 13.5
old 13.2
people 12.8
adult 12.3
retro 12.3
shop 11.9
clock 11.7
world 11.6
light 10.7
business 10.3
city 10
player 10
vintage 9.9
team 9.8
businessman 9.7
decoration 9.7
men 9.4
guitar 9.4
stage 9.3
toyshop 9.2
time 9.1
studio 9.1
musician 9.1
graffito 8.9
technology 8.9
style 8.9
night 8.9
mercantile establishment 8.7
play 8.6
modern 8.4
dance 8.4
dark 8.3
sport 8.3
star 8.3
fun 8.2
bass 8.2
hand 8.1
symbol 8.1
metal 8
drawing 7.9
rock 7.8
musical 7.7
stand 7.6
historical 7.5
sign 7.5
sculpture 7.5
musical instrument 7.4
success 7.2
history 7.2

Google
created on 2022-01-15

Microsoft
created on 2022-01-15

text 99
black and white 92
drawing 91.6
cartoon 62.3
art 61.1
street 60.8
monochrome 58.8
person 55.4

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 30-40
Gender Male, 99.5%
Calm 43.3%
Sad 43.1%
Confused 4.3%
Fear 3.1%
Surprised 2.7%
Disgusted 1.4%
Angry 1.4%
Happy 0.7%

AWS Rekognition

Age 31-41
Gender Female, 54.1%
Calm 76.2%
Sad 17.4%
Happy 3.1%
Disgusted 1.1%
Surprised 0.7%
Fear 0.6%
Confused 0.5%
Angry 0.5%
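
The two face records above (age range, gender, and a ranked list of emotions) resemble the FaceDetails structure returned by AWS Rekognition's detect_faces call when full attributes are requested. A minimal sketch, again assuming the same hypothetical image location:

    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    response = rekognition.detect_faces(
        Image={"S3Object": {"Bucket": "museum-images", "Name": "4.2002.11783.jpg"}},
        Attributes=["ALL"],  # request age range, gender, and emotions
    )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
        # Emotions sorted by confidence, mirroring the ordering shown above.
        for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
            print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")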

Feature analysis

Amazon

Person 99.6%
Airplane 73.9%

Text analysis

Amazon

20497A
20499A
NOUVR
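
The short strings in the Amazon text-analysis list above are typical of AWS Rekognition's detect_text output. A minimal sketch, assuming the same hypothetical image location as in the earlier examples:

    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    response = rekognition.detect_text(
        Image={"S3Object": {"Bucket": "museum-images", "Name": "4.2002.11783.jpg"}},
    )

    # Keep only LINE-level detections, mirroring the short strings listed above.
    for detection in response["TextDetections"]:
        if detection["Type"] == "LINE":
            print(detection["DetectedText"])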

Google

20497A ০ ।
20497A