Human Generated Data

Title

Untitled (man and girl looking at model train set)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17674

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Human 98.7
Person 98.7
Interior Design 98.5
Indoors 98.5
Person 90
Room 87.3
Clinic 76.8
Lab 57.6

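The Amazon tags above have the shape of an AWS Rekognition `detect_labels` response. As a minimal sketch (the sample response below is illustrative, not the stored record for this photograph; a real call via `boto3` requires AWS credentials), label/confidence lines like those listed here can be flattened from such a response:

```python
# Sketch: flatten an AWS Rekognition detect_labels-style response into
# "Label confidence" lines like those listed above. The sample response is
# illustrative only; a real call would be something like
#   boto3.client("rekognition").detect_labels(Image={"Bytes": img_bytes})
sample_response = {
    "Labels": [
        {"Name": "Human", "Confidence": 98.7},
        {"Name": "Person", "Confidence": 98.7},
        {"Name": "Interior Design", "Confidence": 98.5},
        {"Name": "Indoors", "Confidence": 98.5},
    ]
}

def flatten_labels(response):
    """Return one 'Name confidence' string per label, highest confidence first."""
    labels = sorted(response["Labels"],
                    key=lambda l: l["Confidence"], reverse=True)
    return [f"{l['Name']} {round(l['Confidence'], 1)}" for l in labels]

for line in flatten_labels(sample_response):
    print(line)
```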
Imagga
created on 2022-02-26

dishwasher 100
white goods 100
home appliance 88
appliance 62.6
durables 29.2
home 27.1
people 26.2
man 22.8
interior 21.2
indoors 20.2
house 19.2
adult 18.8
computer 18.5
smiling 17.4
person 17.3
male 17
lifestyle 16.6
work 16.6
kitchen 16.4
office 15.5
table 15.3
room 15.3
business 15.2
technology 14.8
women 14.2
laptop 13.7
modern 13.3
businessman 13.2
job 12.4
holding 12.4
men 12
working 11.5
sitting 11.2
indoor 11
cheerful 10.6
hospital 10.4
professional 10.1
smile 10
hand 9.9
health 9.7
medical 9.7
doctor 9.4
happy 9.4
glass 9.3
furniture 9.1
medicine 8.8
looking 8.8
design 8.4
portrait 8.4
equipment 8.4
shop 8.3
shopping 8.3
human 8.2
occupation 8.2
couple 7.8
wall 7.7
clinic 7.7
casual 7.6
manager 7.4
executive 7.4
inside 7.4
desk 7.2
science 7.1
worker 7.1
day 7.1
architecture 7

Google
created on 2022-02-26

White 92.2
Black 89.6
Black-and-white 85.9
Style 84
Monochrome 76
Monochrome photography 75
Rectangle 70.4
Engineering 69.3
Event 67.1
Stock photography 66
Urban design 65.6
Glass 65.4
Science 64.3
Ceiling 64.3
Art 62.8
Room 62
Metal 61.1
Machine 60.2
Font 60.1
Building 54.5

Microsoft
created on 2022-02-26

text 98.2
wall 98
indoor 85.1
black and white 79.6

Face analysis

Amazon

AWS Rekognition

Age 25-35
Gender Male, 98.2%
Calm 99.4%
Surprised 0.2%
Sad 0.2%
Happy 0.1%
Disgusted 0%
Angry 0%
Confused 0%
Fear 0%

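The face analysis values above follow the structure of Rekognition's `detect_faces` output (called with `Attributes=["ALL"]`). A hedged sketch of summarizing such a response into the age-range, gender, and dominant-emotion lines shown here, using illustrative sample data rather than the actual stored record:

```python
# Sketch: summarize an AWS Rekognition detect_faces-style face record into
# the age/gender/emotion lines shown above. Sample data is illustrative; a
# real call would use boto3's detect_faces(Image=..., Attributes=["ALL"]).
sample_face = {
    "AgeRange": {"Low": 25, "High": 35},
    "Gender": {"Value": "Male", "Confidence": 98.2},
    "Emotions": [
        {"Type": "CALM", "Confidence": 99.4},
        {"Type": "SURPRISED", "Confidence": 0.2},
        {"Type": "SAD", "Confidence": 0.2},
    ],
}

def summarize_face(face):
    """Return (age-range, gender, dominant-emotion) display strings."""
    age = f"Age {face['AgeRange']['Low']}-{face['AgeRange']['High']}"
    gender = f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']}%"
    # Dominant emotion = the entry with the highest confidence score.
    top = max(face["Emotions"], key=lambda e: e["Confidence"])
    emotion = f"{top['Type'].capitalize()} {top['Confidence']}%"
    return age, gender, emotion

print(summarize_face(sample_face))
```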
Feature analysis

Amazon

Person 98.7%

Captions

Microsoft

an old photo of a person 59.2%
a person standing in front of a mirror posing for the camera 43.7%
a person standing in front of a mirror posing for the camera 27.8%

Text analysis

Amazon

KODAK-AVEELA

Google

KODVK-2VEELA
KODVK-2VEELA