Human Generated Data

Title

Untitled (couple with baby)

Date

c. 1970

People

Artist: Bill Owens, American, b. 1938

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.1088

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Person 99.3
Human 99.3
Refrigerator 87
Appliance 87
Interior Design 81.4
Indoors 81.4
Meal 78.2
Food 78.2
Home Decor 70.6
Plant 65.6
Person 63.1
Undershirt 57.4
Clothing 57.4
Apparel 57.4
Oven 56.5
Couch 56
Furniture 56
Shorts 55.4

Clarifai
created on 2023-10-25

people 99.9
couple 99
monochrome 98.9
woman 98.8
two 98.7
adult 98.7
man 98.5
three 97.2
group 96.4
portrait 96.1
child 93.5
girl 93.3
family 92.3
group together 89
smile 89
indoors 88.9
room 87.9
beautiful 87.6
furniture 84.2
offspring 84.2

Imagga
created on 2022-01-09

person 28.5
people 25.6
black 25.4
adult 24
man 23.5
women 19
barbershop 18
sexy 17.7
portrait 17.5
fashion 16.6
male 16.4
shop 16.4
lifestyle 15.9
model 15.5
interior 14.1
indoors 14.1
hair 13.5
elegance 13.4
pretty 13.3
attractive 13.3
smiling 13
body 12.8
human 12.7
one 12.7
modern 12.6
mercantile establishment 12.3
cheerful 12.2
face 12.1
sitting 12
happy 11.9
sensuality 11.8
dress 11.7
room 11.4
style 11.1
business 10.9
smile 10.7
office 10.5
call 10.5
brunette 10.5
studio 9.9
clothing 9.8
home 9.6
men 9.4
strength 9.4
employee 9.3
dark 9.2
gorgeous 9.1
music 9
posing 8.9
businessman 8.8
urban 8.7
couple 8.7
place of business 8.6
domestic 8.6
two 8.5
indoor 8.2
equipment 8.1
lady 8.1
worker 8.1
cute 7.9
elegant 7.7
waiter 7.7
communication 7.6
house 7.5
holding 7.4
salon 7.4
make 7.3
device 7.3
fitness 7.2
looking 7.2
job 7.1
working 7.1
life 7.1
happiness 7.1

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

person 99.5
text 98
clothing 95.7
indoor 94
black and white 91.6
food 84.5
man 78.3
table 75.5
human face 57.8
restaurant 50.7

Color Analysis

Face analysis

AWS Rekognition

Age 27-37
Gender Female, 100%
Happy 96.8%
Surprised 0.6%
Fear 0.6%
Calm 0.5%
Sad 0.5%
Angry 0.5%
Disgusted 0.3%
Confused 0.2%

AWS Rekognition

Age 0-3
Gender Male, 64.9%
Calm 95.4%
Happy 2.8%
Sad 0.6%
Surprised 0.4%
Disgusted 0.2%
Angry 0.2%
Fear 0.2%
Confused 0.2%

AWS Rekognition

Age 36-44
Gender Male, 100%
Calm 99.8%
Sad 0.1%
Confused 0%
Surprised 0%
Angry 0%
Happy 0%
Disgusted 0%
Fear 0%

Microsoft Cognitive Services

Age 34
Gender Female

Microsoft Cognitive Services

Age 44
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.3%