Human Generated Data

Title

Untitled (men and women on boat and dock with fish)

Date

c. 1950

People

Artist: Joseph Janney Steinmetz, American 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.8987

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Person 99.8
Human 99.8
Person 99.4
Person 99
Person 89.9
Aircraft 87.1
Helicopter 87.1
Transportation 87.1
Vehicle 87.1
Person 86.9
Person 79.2
Clothing 77.7
Apparel 77.7
Building 77.1
Water 74.7
Waterfront 74.7
Machine 74.2
Shorts 74.1
Dock 69.7
Port 69.7
Harbor 69.7
Pier 69.7
Airplane 66.3
Military 60.7
Cruiser 57.3
Ship 57.3
Navy 57.3
Biplane 56.3
Sailor Suit 55.8

Imagga
created on 2022-01-09

cockpit 39.1
man 24.2
war 21.8
male 21.3
equipment 20.3
vehicle 19.6
military 19.3
uniform 16.6
stage 16.6
industrial 16.3
engine 15.4
work 14.9
weapon 14.6
television camera 14.3
person 14.3
power 14.3
metal 13.7
industry 13.6
mechanical 13.6
factory 13.5
transportation 13.4
machine 13.3
engineer 13.3
steel 13.2
battle 12.7
army 12.7
technology 12.6
protection 11.8
gun 11.8
soldier 11.7
people 11.7
car 11.6
business 11.5
television equipment 11.5
device 11.5
engineering 11.4
platform 11.3
aviator 11.3
adult 11
danger 10.9
tank 10.7
automobile 10.5
transport 10
camouflage 9.8
outdoors 9.7
men 9.4
professional 9.3
training 9.2
occupation 9.2
worker 8.9
warfare 8.9
electronic equipment 8.8
mechanic 8.8
military vehicle 8.6
construction 8.5
wheel 8.5
leisure 8.3
playing 8.2
job 8
manufacturing 7.8
conflict 7.8
color 7.8
clothing 7.8
emergency 7.7
mask 7.7
ship 7.6
safety 7.4
building 7.2
recreation 7.2
history 7.1
game 7.1
to 7.1
working 7.1
sky 7

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

text 99.4
person 90.4
ship 85.8
clothing 85.4
black and white 72.7
man 56.5

Face analysis

Amazon

Google

AWS Rekognition

Age 49-57
Gender Male, 52.9%
Calm 96.9%
Surprised 1.5%
Confused 0.4%
Sad 0.3%
Angry 0.3%
Disgusted 0.2%
Happy 0.2%
Fear 0.2%

AWS Rekognition

Age 24-34
Gender Female, 63%
Happy 81%
Sad 8.7%
Calm 2.8%
Surprised 2.1%
Fear 2.1%
Confused 1.6%
Angry 0.9%
Disgusted 0.7%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.8%
Helicopter 87.1%
Airplane 66.3%

Captions

Microsoft

a group of people on a boat 39.8%
a group of people standing on a boat 36.9%
a group of people in a boat 36.8%

Text analysis

Amazon

TER
42484
SHOT
N
WED
BOAT
TER BOAT YT
HELCAT
II
E
PHONE
HO
OFFS
YT
FAST
HINE
CAR
SHOT and WED
CAR 50
50
FA
KODAK
k
2
SPORT3
roe
k Pranett
-
SAME
Ca
Pranett
not
and
ASSOCIAT

Google

HELCAT SPORT HIN TER BOAT SHOT WED CA 4248 f HOP
HOP
TER
WED
HIN
BOAT
CA
HELCAT
4248
SPORT
SHOT
f