Human Generated Data

Title

Untitled (men unloading cargo from airplane, Olmstead Airfield, Pennsylvania)

Date

1942

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.11785

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-15

Person 99.8
Human 99.8
Person 99.6
Person 88.6
Airplane 83.7
Aircraft 83.7
Transportation 83.7
Vehicle 83.7
Building 73.5
Person 73.2
Person 65.5
Pottery 59
Architecture 58.2
Table 57.9
Furniture 57.9
Meal 55
Food 55

Clarifai
created on 2023-10-25

people 99.4
aircraft 97
airplane 96.2
group together 94.1
woman 93.8
man 92.5
group 92.1
adult 91.9
transportation system 90.3
vehicle 90
child 89.9
light 89.9
art 86.8
monochrome 85.9
military 85.6
chair 85.1
music 84.9
silhouette 84.4
furniture 83.9
travel 83.5

Imagga
created on 2022-01-15

black 22.3
man 18.3
computer 17.9
office 17
monitor 15.9
business 15.8
people 15
silhouette 14.9
laptop 14.7
technology 14.1
equipment 13.9
screen 13.3
male 12.8
work 12.5
desk 11.6
person 11.6
device 11.3
television 11
background 11
light 10.7
night 10.6
modern 10.5
glass 10.1
vintage 9.9
vehicle 9.8
working 9.7
businessman 9.7
film 9.6
center 9.1
display 9
freight car 8.9
color 8.9
car 8.6
sky 8.3
wheeled vehicle 8.3
businesswoman 8.2
sitting 7.7
musical instrument 7.6
art 7.5
style 7.4
retro 7.4
entertainment 7.4
telecommunication system 7.3
tie 7.3
digital 7.3
music 7.2
stage 7.2
machine 7.2
suit 7.2
adult 7.2
worker 7.1
job 7.1
architecture 7

Microsoft
created on 2022-01-15

text 99.9
black and white 93.3
indoor 90.7
concert 89.1
monochrome 60.6
watching 50.9
image 32.9
display 26.1

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 37-45
Gender Male, 79.1%
Calm 40.2%
Fear 14.1%
Disgusted 13.8%
Sad 13.6%
Confused 9.5%
Happy 5%
Surprised 2.3%
Angry 1.4%

AWS Rekognition

Age 36-44
Gender Male, 95.1%
Sad 46.4%
Calm 18.7%
Angry 10.5%
Fear 8.4%
Confused 5.4%
Disgusted 5.3%
Surprised 3.1%
Happy 2.3%

AWS Rekognition

Age 16-22
Gender Female, 55.8%
Fear 43%
Happy 22.2%
Calm 9.4%
Surprised 8.6%
Sad 7.8%
Angry 4.5%
Disgusted 3.1%
Confused 1.4%

Feature analysis

Amazon

Person 99.8%
Airplane 83.7%

Text analysis

Amazon

20504.

Google

20504.
20504.