Human Generated Data

Title

Untitled (uniformed men reading in room with mural of a flying plane, Olmstead Airfield, Middleton, PA)

Date

1942

People

Artist: Joseph Janney Steinmetz, American, 1905-1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.4623

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Person 99.1
Human 99.1
Person 96.8
Person 95.9
Person 89
Person 87.4
Workshop 86.3
Clinic 81
People 76.6
Transportation 75.3
Vehicle 75.3
Airplane 75.3
Aircraft 75.3
Room 74.4
Indoors 74.4
Art 70.1
Furniture 69
Photography 61.4
Photo 61.4
Person 58.3
Person 57.4
Living Room 56.8
Table 56.3
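
The Rekognition labels above are confidence-scored percentages. A minimal sketch of filtering such label/confidence pairs to a threshold (the data structure is hypothetical, mirroring a few entries from the list above, not the museum's actual pipeline):

```python
# Hypothetical label/confidence pairs mirroring the Rekognition output above.
labels = [
    ("Person", 99.1), ("Workshop", 86.3), ("Clinic", 81.0),
    ("Airplane", 75.3), ("Room", 74.4), ("Table", 56.3),
]

def confident_labels(pairs, threshold=80.0):
    """Keep only labels at or above the confidence threshold."""
    return [name for name, score in pairs if score >= threshold]

print(confident_labels(labels))  # → ['Person', 'Workshop', 'Clinic']
```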

Imagga
created on 2021-12-14

architecture 18
sky 17.2
building 17
city 16.6
man 16.1
house 14.2
newspaper 13.7
people 13.4
business 13.4
water 12.7
product 12.6
old 12.5
negative 12.3
marimba 12.1
musical instrument 11.6
black 11.4
travel 11.3
men 11.2
life 10.7
male 10.6
film 10.6
percussion instrument 10.5
urban 10.5
outdoors 10.4
scene 10.4
work 10.3
construction 10.3
day 10.2
person 10
transportation 9.9
landscape 9.7
daily 9.5
outdoor 9.2
history 8.9
lifestyle 8.7
cityscape 8.5
creation 8.4
vacation 8.2
blackboard 8.1
light 8.1
room 8.1
table 7.9
color 7.8
industry 7.7
roof 7.6
power 7.6
relaxation 7.5
sign 7.5
photographic paper 7.5
silhouette 7.4
tourism 7.4
group 7.2
adult 7.2
team 7.2
chair 7.1
working 7.1

Google
created on 2021-12-14

Photograph 94.2
White 92.2
Black 89.7
Chair 86.1
Black-and-white 86.1
Style 84
Line 81.7
Art 80.8
Adaptation 79.4
Tints and shades 77
Monochrome 76.7
Painting 76.2
Monochrome photography 76
Snapshot 74.3
Font 71.2
Room 69.7
Event 68.9
Stock photography 63.6
Table 63.1
Visual arts 62.9

Microsoft
created on 2021-12-14

text 99.8
furniture 91
black and white 89.9
table 88.2
chair 77.7
person 76.9
house 70.5

Face analysis

AWS Rekognition

Age 23-35
Gender Female, 56.7%
Calm 79.9%
Happy 12.8%
Sad 4.5%
Surprised 1.4%
Fear 0.5%
Confused 0.3%
Disgusted 0.3%
Angry 0.3%

AWS Rekognition

Age 43-61
Gender Female, 53.3%
Sad 93.9%
Calm 5.6%
Confused 0.2%
Angry 0.1%
Fear 0.1%
Surprised 0.1%
Happy 0.1%
Disgusted 0%

AWS Rekognition

Age 4-14
Gender Male, 64%
Calm 90.7%
Happy 6.4%
Sad 2%
Surprised 0.4%
Angry 0.3%
Confused 0.1%
Fear 0.1%
Disgusted 0%

AWS Rekognition

Age 50-68
Gender Male, 82.6%
Calm 79.1%
Sad 18.1%
Surprised 0.9%
Confused 0.8%
Happy 0.6%
Angry 0.2%
Disgusted 0.1%
Fear 0.1%

AWS Rekognition

Age 50-68
Gender Male, 97.6%
Calm 92.3%
Happy 2.8%
Angry 1.6%
Sad 1.4%
Surprised 1.2%
Confused 0.3%
Disgusted 0.2%
Fear 0.2%
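
Each Rekognition face record above reports a distribution of emotion scores. A hedged sketch of reducing such a distribution to its dominant emotion, using values copied from the first face record above (not a call to the actual API):

```python
# Emotion scores from the first AWS Rekognition face record above.
emotions = {
    "Calm": 79.9, "Happy": 12.8, "Sad": 4.5, "Surprised": 1.4,
    "Fear": 0.5, "Confused": 0.3, "Disgusted": 0.3, "Angry": 0.3,
}

def dominant_emotion(scores):
    """Return the emotion label with the highest confidence score."""
    return max(scores, key=scores.get)

print(dominant_emotion(emotions))  # → Calm
```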

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
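
Unlike the other services, Google Vision reports likelihood buckets rather than numeric scores. A small sketch comparing such buckets, assuming the standard Vision API likelihood ordering as rendered on this page:

```python
# Google Vision likelihood scale as shown above, least to most likely.
LIKELIHOOD = ["Very unlikely", "Unlikely", "Possible", "Likely", "Very likely"]

def at_least(value, floor):
    """True if `value` sits at or above `floor` on the likelihood scale."""
    return LIKELIHOOD.index(value) >= LIKELIHOOD.index(floor)

print(at_least("Possible", "Unlikely"))      # → True
print(at_least("Very unlikely", "Possible")) # → False
```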

Feature analysis

Amazon

Person 99.1%
Airplane 75.3%

Captions

Microsoft

calendar 15.9%

Text analysis

Amazon

20517.