Human Generated Data

Title

Untitled (man holding tray with picture of his trailer)

Date

c. 1950

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.7570

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Person 98.8
Human 98.8
Apparel 94.1
Clothing 94.1
Meal 86.3
Food 86.3
Furniture 82.6
Chair 82.6
Shorts 81.2
Face 75.5
Helmet 74.2
Dish 73.1
Plant 66.6
Photography 66
Photo 66
Portrait 66
Building 62.2
Sleeve 59.2
Pants 58.9
Urban 55.2

Imagga
created on 2022-01-08

newspaper 49
product 38
creation 29.6
man 29.6
person 26
male 25.5
people 25.1
business 23.7
businessman 19.4
work 18.8
adult 17.2
office 16.7
looking 16
technology 15.6
professional 15.1
building 15
laptop 14.8
shop 14.3
smiling 13.7
modern 13.3
indoors 13.2
men 12.9
sitting 12.9
computer 12.8
job 12.4
restaurant 11.8
serious 11.4
portrait 11
lifestyle 10.8
happy 10.6
working 10.6
group 10.5
blackboard 10.4
senior 10.3
screen 10.3
money 10.2
teamwork 10.2
finance 10.1
smile 10
student 10
face 9.9
room 9.5
bartender 9.5
occupation 9.2
hand 9.1
worker 9.1
health 9
human 9
team 9
medical 8.8
home 8.8
chart 8.6
film 8.5
doctor 8.5
manager 8.4
old 8.4
dollar 8.3
music 8.2
businesswoman 8.2
suit 8.1
success 8
financial 8
table 7.9
mercantile establishment 7.8
education 7.8
thoughtful 7.8
corporate 7.7
laboratory 7.7
exam 7.7
plan 7.6
writing 7.5
mature 7.4
chair 7.4
holding 7.4
world 7.4
bar 7.4
alone 7.3
new 7.3
currency 7.2
medicine 7
architecture 7

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

text 97.7
person 97.3
clothing 86.6
black and white 79.5
smile 77.3
human face 74.3
man 72.4

Face analysis

AWS Rekognition

Age 54-62
Gender Male, 99.7%
Confused 53.2%
Calm 34.7%
Sad 5%
Surprised 2.2%
Disgusted 1.7%
Happy 1.5%
Fear 1%
Angry 0.8%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.8%
Helmet 74.2%

Captions

Microsoft

an old photo of a person 89.7%
an old photo of a person 88.7%
a person posing for the camera 88.6%

Text analysis

Amazon

WILSON
LIZZIE
MASS
JOHN
JOHN & LIZZIE
&
BOSTON MASS
BOSTON
L.76

Google

16
XAGO
BOSTON
MJI7--YT37A°2
MASS
MJI7--YT37A°2 -- XAGO JOMN & LIZZIE WILSON BOSTON MASS 16
--
&
LIZZIE
WILSON
JOMN