Human Generated Data

Title

Untitled (couple in roller coaster car)

Date

1952

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.4672

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Human 96.8
Person 96.8
Person 95.4
Apparel 87.4
Clothing 87.4
Person 86.5
Chair 78.3
Furniture 78.3
Building 67.3
Coat 60.6
Face 58.1
Transportation 58.1
Vehicle 58.1
Shorts 57.9

Imagga
created on 2021-12-14

dishwasher 71.3
white goods 56.8
home appliance 43.8
appliance 31.4
shopping cart 19.6
shopping 18.3
business 17.6
architecture 17.2
building 16.8
buy 16
house 15.9
construction 15.4
work 15.1
supermarket 14.8
newspaper 14.8
product 14.3
durables 14.3
basket 14.2
people 13.9
man 13.4
chair 13.3
plan 13.2
shop 13.1
male 12.8
cart 12.7
technology 12.6
architect 12.5
commerce 12.1
drawing 12.1
metal 12.1
trolley 11.8
handcart 11.7
market 11.5
job 10.6
home 10.4
empty 10.3
sale 10.2
creation 9.9
wheeled vehicle 9.7
working 9.7
design 9.6
container 9.5
store 9.4
equipment 9.4
person 9.4
sky 8.9
buying 8.7
lifestyle 8.7
purchase 8.7
project 8.7
trade 8.6
engineering 8.6
development 8.6
seat 8.3
businessman 7.9
sketch 7.9
adult 7.9
day 7.8
travel 7.7
3d 7.7
men 7.7
apartment 7.7
office 7.5
commercial 7.5
city 7.5
floor 7.4
table 7.4
window 7.4
street 7.4
light 7.3
computer 7.3
black 7.2
science 7.1
worker 7.1
interior 7.1
professional 7
modern 7

Microsoft
created on 2021-12-14

outdoor 99.3
text 91.6
ship 81.2
black and white 80.8
old 60.4

Face analysis

AWS Rekognition

Age 49-67
Gender Female, 59.7%
Calm 32.4%
Confused 25.5%
Sad 17.7%
Angry 13.3%
Happy 7%
Surprised 1.7%
Disgusted 1.7%
Fear 0.7%

AWS Rekognition

Age 23-37
Gender Male, 54.4%
Happy 61.1%
Calm 31.1%
Sad 4%
Surprised 1.5%
Confused 1.2%
Angry 0.6%
Fear 0.3%
Disgusted 0.2%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 96.8%

Captions

Microsoft

a person sitting in front of a building 63%
a person standing in front of a building 62.9%
an old photo of a person 62.8%

Text analysis

Amazon

PHOTOS
AH
PHO
as

Google

AH
PHOTPS
PHO
PHOTPS PHO AH