Human Generated Data

Title

Untitled (men unloading cargo from a plane)

Date

c. 1945

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.7155

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags (label and confidence score, 0-100)

Amazon
created on 2021-12-15

Person 99.7
Human 99.7
Person 99.1
Person 98.9
Person 95.3
Person 95.1
Airplane 87.9
Aircraft 87.9
Vehicle 87.9
Transportation 87.9
Person 76.8
Machine 74.8
Biplane 69.8
Building 68.9
Outdoors 65.6
Nature 60.9
People 60.6
Person 60.2
Shoe 59.5
Clothing 59.5
Footwear 59.5
Apparel 59.5
Shoe 59.1
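
The Amazon tags above match the shape of AWS Rekognition's DetectLabels output. A minimal sketch of reproducing such tags with boto3 follows; the filename "photo.jpg" and the configured AWS credentials are assumptions, not part of this record:

# Label detection with AWS Rekognition via boto3.
import boto3

client = boto3.client("rekognition")

# "photo.jpg" is a hypothetical local copy of the photograph.
with open("photo.jpg", "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=25,
        MinConfidence=55,  # the list above bottoms out near 59
    )

for label in response["Labels"]:
    # Each label carries a name and a 0-100 confidence; labels that also
    # carry Instances (bounding boxes) feed the Feature analysis section below.
    print(f"{label['Name']} {label['Confidence']:.1f}")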

Clarifai
created on 2023-10-15

people 99.5
aircraft 97.2
airplane 96.7
transportation system 95.4
group together 94.9
vehicle 94.6
many 92.4
group 91.8
airport 91.3
monochrome 90.9
military 90.6
man 89.7
adult 87.6
war 84.2
indoors 83.6
wear 79.4
room 78.3
seat 76.6
watercraft 76.5
child 75.2
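
The Clarifai tags read like the concept list of its general image-recognition model. A hedged sketch against Clarifai's public v2 REST API; the API key, model name, and image URL are all placeholders:

# Concept tagging with Clarifai's v2 REST API.
import requests

MODEL = "general-image-recognition"  # assumed public model name
resp = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL}/outputs",
    headers={"Authorization": "Key YOUR_CLARIFAI_API_KEY"},
    json={"inputs": [{"data": {"image": {"url": "https://example.com/photo.jpg"}}}]},
)
resp.raise_for_status()
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    # Clarifai scores concepts 0-1; the list above reads as 0-100.
    print(concept["name"], round(concept["value"] * 100, 1))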

Imagga
created on 2021-12-15

seat 27.3
device 22.8
cockpit 22.6
car 19.1
people 19
man 18.8
support 18.6
person 16.4
equipment 16.2
transportation 16.1
male 14.9
business 14.6
work 13.3
adult 13.1
technology 12.6
automobile 12.4
working 12.4
professional 12.2
computer 11.7
black 11.5
job 11.5
indoors 11.4
hand 11.4
men 11.2
musical instrument 10.8
vehicle 10.8
interior 10.6
digital 10.5
attractive 10.5
plane seat 10.2
room 9.8
drive 9.5
occupation 9.2
portrait 9.1
cheerful 8.9
sexy 8.8
driver 8.7
lifestyle 8.7
auto 8.6
luxury 8.6
communication 8.4
transport 8.2
garage 8.2
night 8
clothing 7.9
sitting 7.7
modern 7.7
engine 7.7
repair 7.7
laptop 7.5
human 7.5
worker 7.4
close 7.4
monitor 7.4
phone 7.4
inside 7.4
tool 7.3
music 7.2
smiling 7.2
shop 7.1
happiness 7
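
Imagga exposes the same kind of tagging through a simple REST endpoint. A sketch, assuming an Imagga key/secret pair and a public image URL (both placeholders):

# Tagging with Imagga's /v2/tags endpoint (HTTP basic auth).
import requests

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/photo.jpg"},
    auth=("YOUR_IMAGGA_API_KEY", "YOUR_IMAGGA_API_SECRET"),
)
resp.raise_for_status()
for entry in resp.json()["result"]["tags"]:
    # The label sits under a language code; confidence is already 0-100.
    print(entry["tag"]["en"], round(entry["confidence"], 1))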

Microsoft
created on 2021-12-15

text 99.9
clothing 92.8
person 92.6
black and white 92.4
concert 66.9
man 65.7
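
The Microsoft tags resemble Azure Computer Vision's tag output. A sketch with the azure-cognitiveservices-vision-computervision SDK; the endpoint URL and key are placeholders:

# Image tagging with Azure Computer Vision.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://YOUR_RESOURCE.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("YOUR_AZURE_KEY"),
)
result = client.tag_image("https://example.com/photo.jpg")
for tag in result.tags:
    # The SDK reports confidence 0-1; the list above reads as 0-100.
    print(tag.name, round(tag.confidence * 100, 1))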

Face analysis

AWS Rekognition

Age 34-50
Gender Male, 64.4%
Calm 64.3%
Angry 13.7%
Sad 9.7%
Fear 5.1%
Happy 4.8%
Surprised 1.4%
Confused 0.7%
Disgusted 0.4%

AWS Rekognition

Age 49-67
Gender Female, 91.4%
Calm 65.2%
Sad 20.4%
Surprised 9.2%
Confused 4.1%
Happy 0.6%
Disgusted 0.2%
Angry 0.2%
Fear 0.1%

AWS Rekognition

Age 26-42
Gender Female, 75.3%
Fear 48.4%
Happy 14.8%
Confused 12.1%
Calm 10.9%
Surprised 9.1%
Sad 3.2%
Angry 1%
Disgusted 0.6%
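
The age range, gender call, and emotion percentages above are the standard fields of Rekognition's DetectFaces response. A minimal sketch, again assuming a hypothetical local file "photo.jpg":

# Face attribute analysis with AWS Rekognition.
import boto3

client = boto3.client("rekognition")
with open("photo.jpg", "rb") as f:
    response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    # Emotions come back as uppercase types (CALM, SAD, ...) with 0-100 scores.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")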

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
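
Google Vision reports face attributes as likelihood buckets rather than percentages, which is why these rows read "Very unlikely" through "Possible". A sketch with the google-cloud-vision client, assuming credentials are configured and "photo.jpg" is a hypothetical local file:

# Face detection with Google Cloud Vision.
from google.cloud import vision

client = vision.ImageAnnotatorClient()
with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)
for face in response.face_annotations:
    # Each attribute is a Likelihood enum: VERY_UNLIKELY ... VERY_LIKELY.
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)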

Feature analysis

Amazon

Person 99.7%
Airplane 87.9%
Shoe 59.5%

Text analysis

Amazon

20506
20506.

Google

20506.
20506.
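
The "20506" readings are plain OCR hits; Rekognition's DetectText returns both LINE and WORD entries, which is why the same string can appear more than once. A sketch with boto3 ("photo.jpg" is again a hypothetical local copy):

# Text detection (OCR) with AWS Rekognition.
import boto3

client = boto3.client("rekognition")
with open("photo.jpg", "rb") as f:
    response = client.detect_text(Image={"Bytes": f.read()})

for det in response["TextDetections"]:
    # Type is LINE or WORD; DetectedText holds the recognized string.
    print(det["Type"], det["DetectedText"], f"{det['Confidence']:.1f}")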