Human Generated Data

Title

Untitled (man driving boat and wearing captain's hat)

Date

c. 1945

People

Artist: Joseph Janney Steinmetz, American 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.7481

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Person 99.8
Human 99.8
Chair 97.5
Furniture 97.5
Apparel 94.5
Clothing 94.5
Hat 92.2
Accessories 64.6
Accessory 64.6
Goggles 64.6
Finger 63
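Label lists like the one above are typically handled as (name, confidence) pairs, and downstream uses often keep only high-confidence labels. A minimal sketch of such filtering, using the Rekognition-style values shown in this record (the threshold and helper name are illustrative assumptions, not part of the original data pipeline):

```python
# Labels from the record above as (name, confidence %) pairs.
labels = [
    ("Person", 99.8), ("Human", 99.8), ("Chair", 97.5), ("Furniture", 97.5),
    ("Apparel", 94.5), ("Clothing", 94.5), ("Hat", 92.2),
    ("Accessories", 64.6), ("Accessory", 64.6), ("Goggles", 64.6), ("Finger", 63.0),
]

def confident_labels(labels, threshold=90.0):
    """Keep labels at or above the confidence threshold, highest first."""
    return [name for name, conf in sorted(labels, key=lambda x: -x[1])
            if conf >= threshold]

print(confident_labels(labels))
# ['Person', 'Human', 'Chair', 'Furniture', 'Apparel', 'Clothing', 'Hat']
```

A threshold of 90% is an arbitrary cutoff chosen here for illustration; the low-confidence tags ("Goggles", "Finger") show why such filtering is commonly applied before using machine-generated tags.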

Imagga
created on 2022-01-09

person 37.9
laptop 37.7
computer 30.6
man 29.6
people 27.9
sitting 25.8
working 24.7
business 23.7
technology 21.5
work 20.9
adult 20.8
outdoors 20.3
male 19.9
lifestyle 18.1
professional 17.3
happy 16.9
office 16.1
attractive 15.4
job 15
pretty 14.7
corporate 14.6
smile 14.3
portrait 14.2
businessman 14.1
smiling 13.7
casual 13.6
disk jockey 13.3
seller 12.8
worker 12.8
wireless 12.4
hair 11.9
businesswoman 11.8
communication 11.8
newspaper 11.7
executive 11.3
scholar 11.3
men 11.2
model 10.9
notebook 10.9
broadcaster 10.7
using 10.6
cheerful 10.6
looking 10.4
product 10.3
device 10.2
20s 10.1
handsome 9.8
suit 9.8
success 9.7
outside 9.4
manager 9.3
phone 9.2
relaxation 9.2
successful 9.2
relaxing 9.1
fashion 9
intellectual 9
water 8.7
day 8.6
happiness 8.6
holiday 8.6
electronics 8.5
face 8.5
career 8.5
sport 8.4
modern 8.4
studio 8.4
hand 8.4
clothing 8.3
communicator 8.2
fun 8.2
vacation 8.2
student 8.2
lady 8.1
creation 7.9
beach 7.8
expression 7.7
joy 7.5
one 7.5
occupation 7.3
bench 7.2
chair 7.2
cute 7.2
vehicle 7.1
women 7.1
travel 7
indoors 7
musical instrument 7

Microsoft
created on 2022-01-09

text 99.8
person 92.4
man 90.7
clothing 84.4
black and white 62.2

Face analysis

AWS Rekognition

Age 43-51
Gender Male, 99.9%
Happy 92.7%
Calm 3.7%
Sad 1.3%
Surprised 0.9%
Confused 0.5%
Angry 0.3%
Disgusted 0.3%
Fear 0.2%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Possible
Blurred Very unlikely

Feature analysis

Amazon

Person 99.8%
Hat 92.2%

Captions

Microsoft

a man sitting in a box 65.4%
a man sitting in front of a box 61.7%
a man sitting on a bench 50.3%

Text analysis

Amazon

5
د8
68988
KODYK-COMEETA

Google

5
58
YT37A°2-XAGO
5 68988 58 YT37A°2-XAGO
68988