Human Generated Data

Title

Untitled (men unloading cargo from ship)

Date

c. 1945

People

Artist: Joseph Janney Steinmetz, American, 1905–1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.7671

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags (confidence, 0-100)

Amazon
created on 2022-01-08

Human 99.4
Person 99.4
Person 96.1
Indoors 94.1
Interior Design 94.1
Person 93.4
Person 91.7
Workshop 83.7
Person 72.3
Building 61.9
Stage 57.2
Clinic 55.3
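
The scores above are label confidences of the kind returned by Amazon Rekognition's label-detection API. A minimal sketch of how comparable tags could be generated, assuming configured AWS credentials and a local copy of the photograph (the file name image.jpg is a placeholder):

    # Minimal sketch: Rekognition-style labels for a local image file.
    import boto3

    client = boto3.client("rekognition")

    with open("image.jpg", "rb") as f:  # placeholder path
        response = client.detect_labels(
            Image={"Bytes": f.read()},
            MaxLabels=20,
            MinConfidence=55,  # the lowest score listed above is 55.3
        )

    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')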

Imagga
created on 2022-01-08

stage 54.8
platform 45.4
building 24.2
shop 22.9
architecture 22.8
modern 21
structure 16.9
mercantile establishment 16.9
interior 16.8
steel 16.8
city 16.6
barbershop 15.5
industrial 15.4
urban 14.8
industry 14.5
business 14
construction 12.8
room 12.6
house 12.5
new 12.1
metal 12.1
water 12
glass 11.7
chair 11.3
place of business 11.1
power 10.9
transportation 10.8
light 10.7
factory 10.6
roof 10.5
technology 10.4
sky 10.2
man 10.1
travel 9.9
balcony 9.7
indoors 9.7
people 9.5
work 9.4
floor 9.3
inside 9.2
bakery 9.1
office 9
table 8.8
home 8.8
lamp 8.7
apartment 8.6
outdoor 8.4
street 8.3
equipment 8.2
decoration 8
blackboard 7.8
pollution 7.7
skyscraper 7.7
buildings 7.6
window 7.5
plant 7.5
transport 7.3
road 7.2
tower 7.2
working 7.1
hall 7
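
Imagga exposes image tagging through a REST endpoint. A minimal sketch of how a comparable tag list could be requested, assuming the v2 /tags endpoint with basic authentication; the API key, secret, and image URL below are placeholders:

    # Minimal sketch: requesting Imagga tags for an image URL.
    import requests

    response = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": "https://example.org/steinmetz-cargo.jpg"},  # placeholder
        auth=("YOUR_API_KEY", "YOUR_API_SECRET"),  # placeholders
    )

    for item in response.json()["result"]["tags"]:
        print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')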

Google
created on 2022-01-08

Building 85.5
Art 80.2
Machine 72.5
Room 69.8
House 67.8
Monochrome 65.9
Monochrome photography 65.3
Window 62.4
History 62.3
Wood 62.1
Illustration 56.1
Photographic paper 53.9
Visual arts 52.4
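
These labels match the output of Google Cloud Vision label detection, with scores shown here on a 0-100 scale. A minimal sketch, assuming the google-cloud-vision 2.x client library, application-default credentials, and a placeholder file name:

    # Minimal sketch: Google Cloud Vision label detection for a local file.
    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("image.jpg", "rb") as f:  # placeholder path
        image = vision.Image(content=f.read())

    response = client.label_detection(image=image)
    for label in response.label_annotations:
        print(f"{label.description} {label.score * 100:.1f}")  # score is 0-1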

Microsoft
created on 2022-01-08

text 96.8
ship 91.5
person 71.7
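
The Microsoft tags correspond to the Azure Computer Vision image-analysis service. A minimal sketch using its REST interface, assuming API version 3.2; the endpoint, subscription key, and image URL are placeholders:

    # Minimal sketch: Azure Computer Vision "analyze" call requesting tags.
    import requests

    endpoint = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder
    key = "YOUR_SUBSCRIPTION_KEY"  # placeholder

    response = requests.post(
        f"{endpoint}/vision/v3.2/analyze",
        params={"visualFeatures": "Tags"},
        headers={"Ocp-Apim-Subscription-Key": key},
        json={"url": "https://example.org/steinmetz-cargo.jpg"},  # placeholder
    )

    for tag in response.json()["tags"]:
        print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')  # confidence is 0-1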

Face analysis

AWS Rekognition

Age 21-29
Gender Male, 83.7%
Calm 68.7%
Confused 5.7%
Surprised 5%
Sad 4.7%
Disgusted 4.6%
Happy 4.3%
Angry 3.5%
Fear 3.4%
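
The age range, gender, and emotion scores above follow the structure of Rekognition's face-detection response. A minimal sketch, assuming configured AWS credentials and a placeholder file name:

    # Minimal sketch: Rekognition face analysis (age range, gender, emotions).
    import boto3

    client = boto3.client("rekognition")

    with open("image.jpg", "rb") as f:  # placeholder path
        response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f'Age {age["Low"]}-{age["High"]}')
        print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')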

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
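
Google Vision reports per-face attributes as likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than percentages. A minimal sketch, assuming the google-cloud-vision 2.x client library and a placeholder file name:

    # Minimal sketch: Google Cloud Vision face-detection likelihoods.
    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("image.jpg", "rb") as f:  # placeholder path
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)
    for face in response.face_annotations:
        print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
        print("Anger", vision.Likelihood(face.anger_likelihood).name)
        print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
        print("Joy", vision.Likelihood(face.joy_likelihood).name)
        print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
        print("Blurred", vision.Likelihood(face.blurred_likelihood).name)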

Feature analysis

Amazon

Person 99.4%
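
The per-object "Person" detection above likely comes from the Instances field of the same Rekognition label response, which adds a bounding box per detected object. A minimal self-contained sketch under that assumption:

    # Minimal sketch: per-object instances (with bounding boxes) from detect_labels.
    import boto3

    client = boto3.client("rekognition")

    with open("image.jpg", "rb") as f:  # placeholder path
        labels = client.detect_labels(Image={"Bytes": f.read()})["Labels"]

    for label in labels:
        for instance in label.get("Instances", []):
            print(label["Name"], f'{instance["Confidence"]:.1f}%', instance["BoundingBox"])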

Captions

Microsoft

a group of people standing in front of a window 74.8%
a group of people standing next to a window 74.7%
a group of people in a room 74.6%
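
The candidate captions match the output of the Azure Computer Vision "describe" operation. A minimal sketch of its REST interface, assuming API version 3.2; the endpoint, key, and image URL are placeholders:

    # Minimal sketch: candidate captions from Azure Computer Vision "describe".
    import requests

    endpoint = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder
    key = "YOUR_SUBSCRIPTION_KEY"  # placeholder

    response = requests.post(
        f"{endpoint}/vision/v3.2/describe",
        params={"maxCandidates": 3},
        headers={"Ocp-Apim-Subscription-Key": key},
        json={"url": "https://example.org/steinmetz-cargo.jpg"},  # placeholder
    )

    for caption in response.json()["description"]["captions"]:
        print(f'{caption["text"]} {caption["confidence"] * 100:.1f}%')  # 0-1 scale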

Text analysis

Amazon

32
38
5
39
1
Co
DE Co
DE
No. 1
No.
No.3
FFCa
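
The fragments above are word-level OCR results of the kind Rekognition's text detection returns. A minimal sketch, assuming configured AWS credentials and a placeholder file name:

    # Minimal sketch: Rekognition text detection (OCR), word-level results only.
    import boto3

    client = boto3.client("rekognition")

    with open("image.jpg", "rb") as f:  # placeholder path
        response = client.detect_text(Image={"Bytes": f.read()})

    for detection in response["TextDetections"]:
        if detection["Type"] == "WORD":
            print(detection["DetectedText"], f'{detection["Confidence"]:.1f}%')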

Google

No
1
32 38 39 UF Co No 1
Co
UF
32
38
39
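
Google Cloud Vision's OCR returns a full detected-text block first, followed by individual tokens, which matches the mix of combined and single-token entries above. A minimal sketch, assuming the google-cloud-vision 2.x client library and a placeholder file name:

    # Minimal sketch: Google Cloud Vision text detection (OCR).
    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("image.jpg", "rb") as f:  # placeholder path
        image = vision.Image(content=f.read())

    response = client.text_detection(image=image)
    # The first annotation is the full detected text; the rest are individual tokens.
    for annotation in response.text_annotations:
        print(annotation.description)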