Human Generated Data

Title

Untitled (sailors waiting in line at barrel)

Date

1941

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8628

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Human 99.7
Person 99.7
Person 99.6
Person 99.5
Person 98.5
Person 98.1
Person 97.9
Person 96.4
Person 91.3
Clothing 90.3
Apparel 90.3
Sailor Suit 80.7
Person 77.1
People 76.3
Person 74.5
Building 70.9
Shorts 68
Officer 67.9
Military 67.9
Military Uniform 67.9
Helicopter 60.8
Transportation 60.8
Vehicle 60.8
Aircraft 60.8
Helmet 60.3
Pants 58.4
Overcoat 55.9
Coat 55.9
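
The record does not include the code used to produce these labels. The following is a minimal sketch of how comparable labels could be requested from Amazon Rekognition with boto3; the filename, region, and confidence threshold are placeholders, not part of the catalog data.

```python
import boto3

# Minimal sketch: request image labels from Amazon Rekognition.
# "steinmetz_sailors.jpg" is a placeholder filename, not part of the record.
rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("steinmetz_sailors.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=50,
    MinConfidence=55,  # roughly the lowest confidence shown above
)

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```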

Clarifai
created on 2023-10-25

people 99.8
group together 98.7
adult 98.1
group 97.4
many 96.5
man 95.3
vehicle 94.3
woman 92.7
watercraft 91.7
transportation system 90.9
several 88.9
monochrome 88.6
commerce 86.4
uniform 84.1
employee 83.8
war 83.5
wear 82
military 82
three 81.5
container 81.2
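
A hedged sketch of how tags like these might be retrieved from Clarifai's v2 predict ("model outputs") endpoint as commonly documented; the API key, model id, image URL, and payload shape are all assumptions rather than details taken from this record.

```python
import requests

# Sketch only: Clarifai v2 predict endpoint. Key, model id, and URL are
# placeholders; the response path follows Clarifai's documented format.
CLARIFAI_KEY = "your_api_key"
MODEL_ID = "general-image-recognition"  # assumed general-model id
IMAGE_URL = "https://example.org/steinmetz_sailors.jpg"

resp = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {CLARIFAI_KEY}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
)
resp.raise_for_status()

for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")
```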

Imagga
created on 2022-01-09

turnstile 50.4
gate 40.6
movable barrier 30.3
machine 26.5
container 23
industry 23
factory 22.3
industrial 21.8
milking machine 20.6
barrier 20.4
device 19.4
building 17.6
transportation 17
dairy 16.5
station 15.5
steel 15
city 14.9
work 14.9
urban 14.8
metal 14.5
power 14.3
old 13.9
architecture 13.3
pipe 13.2
men 12.9
plant 12.8
people 11.7
engineering 11.4
equipment 11
vessel 10.7
travel 10.6
structure 10.5
obstruction 10.4
business 10.3
oil 10.2
energy 10.1
inside 9.2
modern 9.1
train 9
interior 8.8
man 8.7
fuel 8.7
pollution 8.6
milk can 8.6
storage 8.6
street 8.3
pump 8.1
passenger 8
valve 8
refinery 7.9
pipes 7.9
manufacturing 7.8
machinery 7.8
scene 7.8
production 7.8
mechanical 7.8
chemical 7.7
tube 7.7
gas 7.7
engine 7.7
concrete 7.6
heavy 7.6
can 7.6
environment 7.4
water 7.3
transport 7.3
landmark 7.2
male 7.1
tank 7.1
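
Imagga exposes a REST tagging endpoint; a sketch of how tags in this form could be fetched, assuming the v2 /tags endpoint with HTTP Basic auth. The key, secret, and image URL are placeholders.

```python
import requests

# Sketch only: Imagga v2 tagging endpoint. Credentials and URL are placeholders.
IMAGGA_KEY = "your_api_key"
IMAGGA_SECRET = "your_api_secret"
IMAGE_URL = "https://example.org/steinmetz_sailors.jpg"

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),
)
resp.raise_for_status()

# Imagga reports confidences on a 0-100 scale, matching the values above.
for item in resp.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")
```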

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

clothing 98.6
person 98.2
outdoor 98
man 96.6
black and white 95.3
text 94.9
street 85.2
people 66.8
footwear 65.3
monochrome 64.7
waste container 57.2
outdoor object 44
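
Tags of this kind typically come from Azure Computer Vision's image-analysis call; a hedged sketch against the v3.2 REST endpoint, assuming an Azure resource endpoint and key (both placeholders). The service reports confidences on a 0-1 scale, scaled here to match the percentages shown above.

```python
import requests

# Sketch only: Azure Computer Vision "Analyze Image" with the Tags feature.
ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"  # placeholder
KEY = "your_subscription_key"  # placeholder

with open("steinmetz_sailors.jpg", "rb") as f:  # placeholder filename
    image_bytes = f.read()

resp = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={
        "Ocp-Apim-Subscription-Key": KEY,
        "Content-Type": "application/octet-stream",
    },
    data=image_bytes,
)
resp.raise_for_status()

for tag in resp.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")
```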

Color Analysis

Face analysis

AWS Rekognition

Age 22-30
Gender Female, 61.5%
Calm 99.8%
Sad 0.1%
Angry 0.1%
Surprised 0%
Confused 0%
Disgusted 0%
Happy 0%
Fear 0%

AWS Rekognition

Age 40-48
Gender Male, 99.7%
Calm 99.9%
Sad 0%
Surprised 0%
Confused 0%
Happy 0%
Angry 0%
Disgusted 0%
Fear 0%

AWS Rekognition

Age 28-38
Gender Male, 73.3%
Calm 98.8%
Sad 0.4%
Surprised 0.3%
Confused 0.1%
Angry 0.1%
Disgusted 0.1%
Happy 0.1%
Fear 0.1%

AWS Rekognition

Age 42-50
Gender Male, 99.8%
Calm 53.3%
Sad 44.1%
Happy 1%
Confused 0.8%
Disgusted 0.3%
Angry 0.2%
Surprised 0.2%
Fear 0.1%

AWS Rekognition

Age 51-59
Gender Male, 99.1%
Calm 56%
Surprised 27.7%
Sad 7.5%
Confused 4.2%
Happy 2.2%
Disgusted 1.4%
Angry 0.8%
Fear 0.2%

AWS Rekognition

Age 47-53
Gender Male, 97.2%
Calm 98.1%
Sad 1.1%
Fear 0.3%
Happy 0.3%
Confused 0.1%
Disgusted 0.1%
Angry 0.1%
Surprised 0%
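
Age ranges, gender estimates, and per-emotion confidences like those above are what Amazon Rekognition returns from its face-detection call. A minimal sketch with boto3; the filename is a placeholder.

```python
import boto3

# Sketch: per-face attributes (age range, gender, emotions) from Rekognition.
rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("steinmetz_sailors.jpg", "rb") as f:  # placeholder filename
    image_bytes = f.read()

faces = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # include age range, gender, and emotions
)

for face in faces["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    top = max(face["Emotions"], key=lambda e: e["Confidence"])
    print(
        f"Age {age['Low']}-{age['High']}, "
        f"Gender {gender['Value']} {gender['Confidence']:.1f}%, "
        f"{top['Type'].title()} {top['Confidence']:.1f}%"
    )
```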

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely
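
The Google Vision results above are likelihood buckets rather than percentages. A sketch of the corresponding face-detection call, assuming the current google-cloud-vision client library; the filename is a placeholder.

```python
from google.cloud import vision

# Sketch: face detection with the Google Cloud Vision client library.
client = vision.ImageAnnotatorClient()

with open("steinmetz_sailors.jpg", "rb") as f:  # placeholder filename
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)
for face in response.face_annotations:
    print(
        "Surprise", vision.Likelihood(face.surprise_likelihood).name,
        "| Anger", vision.Likelihood(face.anger_likelihood).name,
        "| Sorrow", vision.Likelihood(face.sorrow_likelihood).name,
        "| Joy", vision.Likelihood(face.joy_likelihood).name,
        "| Headwear", vision.Likelihood(face.headwear_likelihood).name,
        "| Blurred", vision.Likelihood(face.blurred_likelihood).name,
    )
```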

Feature analysis

Amazon

Person 99.7%
Helicopter 60.8%
Helmet 60.3%

Text analysis

Amazon

18108
18108.
NAMTBA3

Google

18108.
18108. 18108.
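
The strings above ("18108", "NAMTBA3") are OCR output from the two services. A sketch of how both text-detection calls could be made against the same placeholder image file.

```python
import boto3
from google.cloud import vision

with open("steinmetz_sailors.jpg", "rb") as f:  # placeholder filename
    image_bytes = f.read()

# Amazon Rekognition OCR: returns LINE and WORD detections with confidences.
rekognition = boto3.client("rekognition", region_name="us-east-1")
for det in rekognition.detect_text(Image={"Bytes": image_bytes})["TextDetections"]:
    if det["Type"] == "LINE":
        print("Rekognition:", det["DetectedText"], f"{det['Confidence']:.1f}")

# Google Cloud Vision OCR: the first annotation holds the full detected text.
client = vision.ImageAnnotatorClient()
response = client.text_detection(image=vision.Image(content=image_bytes))
if response.text_annotations:
    print("Google Vision:", response.text_annotations[0].description)
```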