Human Generated Data

Title

Untitled (children playing with toys on floor)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.16857

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Person 96.7
Human 96.7
Person 96.4
Person 90.1
Furniture 83.1
Person 79.5
Workshop 76.5
Person 75.4
Indoors 69.6
Room 67.4
Person 64.1
Chair 64.1
Face 61.9
Flooring 61.1
Clothing 57.7
Apparel 57.7
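
These labels match the shape of output from AWS Rekognition's DetectLabels API. A minimal sketch of how such tags could be reproduced, assuming configured AWS credentials (the region and file name are hypothetical):

import boto3

client = boto3.client("rekognition", region_name="us-east-1")

with open("untitled_children_playing.jpg", "rb") as f:  # hypothetical file
    image_bytes = f.read()

response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=50,  # the list above stops near 57%, so a cutoff like this
)

# Each label carries a name and a confidence score, as listed above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')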

Clarifai
created on 2023-10-29

people 99.9
guitar 99.4
adult 98.9
man 98.3
monochrome 98.2
music 97.5
two 97.3
stringed instrument 96.3
instrument 96.1
one 95.6
group 95.4
group together 94.5
musician 94.2
guitarist 93.7
furniture 90.9
portrait 90.2
recreation 89.1
three 87.1
home 85.2
chair 83
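
Tags like these are consistent with Clarifai's general image-recognition model. A minimal sketch against its v2 REST API; the API key, model path, and image URL are hypothetical, and the exact endpoint shape may differ by account setup:

import requests

API_KEY = "YOUR_CLARIFAI_API_KEY"  # hypothetical credential
url = ("https://api.clarifai.com/v2/users/clarifai/apps/main/"
       "models/general-image-recognition/outputs")

payload = {
    "inputs": [
        {"data": {"image": {"url": "https://example.org/photo.jpg"}}}  # hypothetical
    ]
}
resp = requests.post(url, json=payload,
                     headers={"Authorization": f"Key {API_KEY}"})
resp.raise_for_status()

# Concepts come back with a name and a 0-1 value, matching the scores
# above once scaled to percentages.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')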

Imagga
created on 2022-02-26

stringed instrument 76.9
musical instrument 67.8
device 38.4
man 28.2
people 27.3
person 27.2
indoors 23.7
sitting 23.2
adult 23.2
lifestyle 21.7
table 20.8
male 20.6
women 20.5
home 18.3
room 18.2
portrait 17.5
smiling 17.4
happy 16.3
smile 15.7
computer 15.3
business 15.2
cheerful 14.6
work 14.4
happiness 14.1
dulcimer 14
percussion instrument 13.6
chair 13.3
sitar 13.2
laptop 12.9
indoor 12.8
casual 12.7
job 12.4
desk 12.3
mature 12.1
men 12
communication 11.7
holding 11.5
working 11.5
businessman 11.5
together 11.4
relaxation 10.9
black 10.8
worker 10.7
office 10.7
interior 10.6
enjoying 10.4
looking 10.4
technology 10.4
face 9.9
grand piano 9.8
group 9.7
sit 9.5
day 9.4
piano 9.3
horizontal 9.2
relaxing 9.1
cup 9
banjo 8.9
lady 8.9
color 8.9
teacher 8.8
boy 8.7
education 8.7
play 8.6
drinking 8.6
enjoyment 8.4
senior 8.4
modern 8.4
old 8.4
coffee 8.3
phone 8.3
music 8.3
student 8.2
alone 8.2
guitar 8.2
success 8
pretty 7.7
attractive 7.7
youth 7.7
two 7.6
reading 7.6
talking 7.6
house 7.5
instrument 7.5
one 7.5
camera 7.4
book 7.3
businesswoman 7.3
sexy 7.2
wind instrument 7.2
bright 7.1
handsome 7.1
classroom 7
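
A minimal sketch of fetching such tags from Imagga's /v2/tags endpoint, which authenticates with HTTP Basic auth; the key, secret, and image URL are hypothetical placeholders:

import requests

auth = ("IMAGGA_API_KEY", "IMAGGA_API_SECRET")  # hypothetical credentials
resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/photo.jpg"},  # hypothetical
    auth=auth,
)
resp.raise_for_status()

# Imagga reports a confidence score per English tag, as in the list above.
for item in resp.json()["result"]["tags"]:
    print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')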

Google
created on 2022-02-26

(no tags returned)

Microsoft
created on 2022-02-26

text 97.8
person 85.1
black and white 84.3
drawing 73.6
man 55.5
clothing 53.7
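
These tags match the shape of output from Azure Computer Vision's analyze endpoint (v3.2 shown; the resource endpoint, key, and image URL are hypothetical):

import requests

endpoint = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # hypothetical
resp = requests.post(
    f"{endpoint}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": "YOUR_KEY"},  # hypothetical
    json={"url": "https://example.org/photo.jpg"},  # hypothetical
)
resp.raise_for_status()

# Confidence is returned on a 0-1 scale; scaled here to match the list above.
for tag in resp.json()["tags"]:
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')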

Face analysis

AWS Rekognition

Age 33-41
Gender Female, 56.9%
Calm 96%
Happy 2.5%
Disgusted 0.3%
Angry 0.3%
Surprised 0.3%
Confused 0.2%
Fear 0.2%
Sad 0.1%

AWS Rekognition

Age 41-49
Gender Female, 78.2%
Calm 65.3%
Happy 16.6%
Sad 8.7%
Surprised 3%
Confused 2.4%
Fear 1.5%
Disgusted 1.3%
Angry 1.2%

AWS Rekognition

Age 30-40
Gender Male, 92%
Calm 96.2%
Happy 1.6%
Sad 1.5%
Confused 0.2%
Angry 0.2%
Surprised 0.1%
Disgusted 0.1%
Fear 0.1%

AWS Rekognition

Age 42-50
Gender Male, 93.2%
Calm 87.9%
Sad 5.6%
Happy 2.8%
Fear 1.1%
Angry 0.9%
Disgusted 0.7%
Confused 0.6%
Surprised 0.4%
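
The age range, gender, and emotion estimates above follow the shape of AWS Rekognition's DetectFaces output with full attributes requested. A minimal sketch, assuming configured AWS credentials (the file name is hypothetical):

import boto3

client = boto3.client("rekognition", region_name="us-east-1")

with open("untitled_children_playing.jpg", "rb") as f:  # hypothetical file
    image_bytes = f.read()

response = client.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # required to get age, gender, and emotions
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    gender = face["Gender"]
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotions sorted from most to least confident, as in the lists above.
    for emotion in sorted(face["Emotions"],
                          key=lambda e: e["Confidence"], reverse=True):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')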

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
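
The likelihood ratings above (e.g. "Joy Very unlikely") match Google Cloud Vision face detection, which rates each attribute on a Likelihood scale rather than a percentage. A minimal sketch, assuming the google-cloud-vision client library and application credentials (the file name is hypothetical):

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("untitled_children_playing.jpg", "rb") as f:  # hypothetical file
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Each attribute is a Likelihood enum (VERY_UNLIKELY .. VERY_LIKELY).
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)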

Feature analysis

Amazon

Person 96.7%
Person 96.4%
Person 90.1%
Person 79.5%
Person 75.4%
Person 64.1%
Chair 64.1%

Text analysis

Amazon

KODAK-SVEELA
Modyle
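
Strings such as the film-edge markings above are what AWS Rekognition's DetectText returns for text found in the image. A minimal sketch, assuming configured AWS credentials (the file name is hypothetical):

import boto3

client = boto3.client("rekognition", region_name="us-east-1")

with open("untitled_children_playing.jpg", "rb") as f:  # hypothetical file
    response = client.detect_text(Image={"Bytes": f.read()})

# LINE detections group the underlying WORD detections; printing lines
# mirrors the short list above.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])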

Google

YT33A2-XAGON
YT33A2-XAGON
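
The Google results above match Cloud Vision's text detection (OCR). A minimal sketch, assuming google-cloud-vision and application credentials (the file name is hypothetical):

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("untitled_children_playing.jpg", "rb") as f:  # hypothetical file
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)

# The first annotation is the full detected text; the rest are its
# components, which is why a marking can appear more than once above.
for annotation in response.text_annotations[1:]:
    print(annotation.description)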