Human Generated Data

Title

Untitled (men unloading cargo from plane at Olmstead Airfield)

Date

1942

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.7160

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2021-12-15

Person 99.7%
Human 99.7%
Person 99.3%
Person 97%
Airplane 93.1%
Aircraft 93.1%
Vehicle 93.1%
Transportation 93.1%
Person 92.2%
Person 71.3%
Machine 65.6%
Train 63.6%
People 62.4%
Wheel 58.1%

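These labels and scores are consistent with Amazon Rekognition's label-detection output. For reference, a minimal sketch of how such tags can be produced, assuming configured AWS credentials and a local copy of the image; the filename photo.jpg and the confidence threshold are placeholders, not details taken from this record:

```python
# Minimal sketch: label detection with Amazon Rekognition (boto3).
# Assumes AWS credentials are configured; "photo.jpg" is a hypothetical local file.
import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=50,  # only return labels at or above 50% confidence
    )

for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}%')
```
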
Clarifai
created on 2023-10-15

aircraft 99.4%
airplane 99.2%
people 99.2%
transportation system 98.9%
military 98.5%
vehicle 98%
war 96.9%
group together 96.6%
man 91.2%
group 90.6%
airport 89.8%
many 89.8%
aviate 89.7%
navy 89.3%
monochrome 87.9%
adult 85.7%
warship 84.8%
air force 84.2%
indoors 81%
several 79.4%

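A comparable tag list can be requested from Clarifai's general image-recognition model. The sketch below is only an assumption-based example against Clarifai's published v2 REST API; the model name, endpoint, key, and image URL are placeholders and may differ from whatever pipeline actually produced this record:

```python
# Sketch only: tagging with Clarifai's general model over the v2 REST API.
# Endpoint, model name, and payload shape are assumptions from Clarifai's docs;
# the API key and image URL are placeholders.
import requests

API_KEY = "YOUR_CLARIFAI_KEY"          # placeholder
MODEL = "general-image-recognition"    # assumed public model name
url = f"https://api.clarifai.com/v2/models/{MODEL}/outputs"

payload = {"inputs": [{"data": {"image": {"url": "https://example.org/photo.jpg"}}}]}
resp = requests.post(url, json=payload, headers={"Authorization": f"Key {API_KEY}"})
resp.raise_for_status()

for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    # Clarifai scores are 0-1; scale to percent to match the listing above
    print(f'{concept["name"]} {concept["value"] * 100:.1f}%')
```
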
Imagga
created on 2021-12-15

equipment 31.3%
interior 28.3%
seat 24.7%
room 22%
chair 20.8%
device 20.7%
table 19.5%
modern 18.2%
indoors 17.6%
business 17%
technology 15.6%
furniture 15.3%
center 14.8%
support 14.8%
cockpit 14.2%
computer 13.2%
design 12.9%
monitor 12.6%
glass 12.4%
machine 11.9%
health 11.8%
reflection 11.5%
people 11.2%
inside 11%
office 11%
steel 10.6%
gym 10.5%
plane seat 10.5%
club 10.4%
floor 10.2%
man 10.1%
transportation 9.9%
kitchen 9.8%
digital 9.7%
new 9.7%
stage 9.5%
empty 9.4%
light 9.4%
window 9.3%
3d 9.3%
training 9.2%
clean 9.2%
shop 9.1%
electronic equipment 9.1%
work 8.6%
automobile 8.6%
men 8.6%
car 8.6%
industry 8.5%
studio 8.4%
indoor 8.2%
restaurant 8.1%
classroom 8%
nobody 7.8%
vehicle 7.7%
train 7.7%
auto 7.7%
display 7.6%
desk 7.5%
horizontal 7.5%
strength 7.5%
row 7.5%
occupation 7.3%
keyboard 7.3%
exercise 7.3%
working 7.1%
architecture 7%

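Imagga exposes a similar tagging endpoint. A rough sketch against its documented /v2/tags REST API follows; the key, secret, and image URL are placeholders, and the call shown is not necessarily the pipeline used for this record:

```python
# Sketch only: tagging with Imagga's /v2/tags endpoint (HTTP basic auth).
# API key, secret, and image URL are placeholders.
import requests

API_KEY, API_SECRET = "YOUR_KEY", "YOUR_SECRET"
resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/photo.jpg"},
    auth=(API_KEY, API_SECRET),
)
resp.raise_for_status()

for tag in resp.json()["result"]["tags"]:
    # Imagga reports confidence on a 0-100 scale
    print(f'{tag["tag"]["en"]} {tag["confidence"]:.1f}%')
```
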
Google
created on 2021-12-15

Microsoft
created on 2021-12-15

text 99.5%
black and white 81.9%
person 79.2%
clothing 69.7%
concert 61.7%
several 10.4%

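The Microsoft tags resemble output from Azure's Computer Vision "Analyze Image" API. A hedged sketch, assuming the v3.2 REST endpoint; the resource endpoint, key, and image URL are placeholders:

```python
# Sketch only: tag generation with Azure Computer Vision "Analyze Image" (v3.2 assumed).
# Endpoint, key, and image URL are placeholders.
import requests

ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com"
KEY = "YOUR_AZURE_KEY"

resp = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": "https://example.org/photo.jpg"},
)
resp.raise_for_status()

for tag in resp.json()["tags"]:
    # Azure reports confidence 0-1; scale to percent to match the listing above
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}%')
```
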
Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 49-67
Gender Male, 53.5%
Calm 97.4%
Happy 0.9%
Confused 0.7%
Surprised 0.4%
Sad 0.3%
Angry 0.2%
Fear 0.1%
Disgusted 0%

AWS Rekognition

Age 49-67
Gender Male, 69.4%
Calm 82.4%
Sad 7.5%
Surprised 3.3%
Confused 3.1%
Angry 1.3%
Disgusted 1.3%
Happy 0.7%
Fear 0.4%

AWS Rekognition

Age 20-32
Gender Female, 80.5%
Fear 86.5%
Calm 6.3%
Sad 2.2%
Confused 1.5%
Surprised 1.5%
Angry 1%
Disgusted 0.5%
Happy 0.4%

AWS Rekognition

Age 42-60
Gender Female, 97.6%
Sad 42.7%
Calm 39%
Surprised 7%
Confused 6.4%
Fear 1.8%
Happy 1.5%
Angry 1%
Disgusted 0.5%

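The age, gender, and emotion estimates above match the shape of Amazon Rekognition's DetectFaces response. A minimal sketch, assuming configured AWS credentials and a hypothetical local file photo.jpg:

```python
# Minimal sketch: face attribute estimates with Amazon Rekognition DetectFaces.
# Assumes AWS credentials are configured; "photo.jpg" is a hypothetical local file.
import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age range, gender, and emotion estimates
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```
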
Google

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

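Unlike Rekognition, the Google Vision entries report categorical likelihoods (Very unlikely through Very likely) rather than percentages. A minimal sketch using the google-cloud-vision client library, assuming application credentials are configured and photo.jpg is a hypothetical local file:

```python
# Minimal sketch: face detection with the Google Cloud Vision client library.
# Assumes the google-cloud-vision package and application default credentials;
# "photo.jpg" is a hypothetical local file.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Vision returns categorical likelihoods (VERY_UNLIKELY ... VERY_LIKELY)
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)
```
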
Feature analysis

Amazon

Person 99.7%
Airplane 93.1%
Train 63.6%
Wheel 58.1%

Categories

Captions

Text analysis

Amazon

20506A.
MAMT
VAGON MAMT
VAGON
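
Strings like these are typical of Amazon Rekognition's DetectText (OCR) output on historical photographs. A minimal sketch, assuming configured AWS credentials and a hypothetical local file photo.jpg:

```python
# Minimal sketch: text (OCR) detection with Amazon Rekognition DetectText.
# Assumes AWS credentials are configured; "photo.jpg" is a hypothetical local file.
import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = rekognition.detect_text(Image={"Bytes": f.read()})

# Each detection is either a LINE or a WORD; print the detected lines only.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])
```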