Human Generated Data

Title

Untitled (two girls and a boy having a tea party)

Date

c. 1950

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.15601
Machine Generated Data

Tags

Amazon
created on 2022-02-05

Person 97.2
Human 97.2
Person 96.9
Person 92.4
Furniture 90.3
Living Room 84
Room 84
Indoors 84
Painting 81.4
Art 81.4
Couch 78.5
Face 75.4
Clothing 71.6
Apparel 71.6
People 65.8
Sitting 64.3
Photography 60.4
Photo 60.4
Female 57.5
Table 57.4
Shelf 56.6

Clarifai
created on 2023-10-28

people 99.9
furniture 99.1
adult 97.2
room 96.6
woman 96.1
one 96.1
monochrome 96
chair 95.7
seat 93
two 92.9
wear 92.4
music 91.8
dress 91.5
sit 91.5
group 91.3
man 90.8
actress 90.7
veil 88.8
musician 88.6
art 88.5

Imagga
created on 2022-02-05

office 31.5
man 30.9
people 29
person 28.6
desk 27.7
business 26.7
television 26
businessman 24.7
male 24.1
table 23.9
room 22.7
sitting 21.5
work 21.2
furniture 20.9
computer 20.5
home 19.1
monitor 18.7
happy 16.9
professional 16.7
telecommunication system 16.7
indoor 16.4
job 15.9
laptop 15.2
team 15.2
blackboard 15.2
adult 15
men 14.6
businesswoman 14.5
chair 14.4
indoors 14
women 13.4
interior 13.3
smiling 13
smile 12.8
boss 12.4
working 12.4
occupation 11.9
confident 11.8
lifestyle 11.6
lady 11.3
meeting 11.3
senior 11.2
teamwork 11.1
house 10.9
family 10.7
shop 10
furnishing 9.8
modern 9.8
portrait 9.7
technology 9.6
couple 9.6
love 9.5
career 9.5
corporate 9.4
executive 9.4
manager 9.3
mature 9.3
communication 9.2
screen 9.1
alone 9.1
suit 9.1
pretty 9.1
attractive 9.1
cheerful 8.9
worker 8.9
group 8.9
looking 8.8
couch 8.7
teacher 8.7
mother 8.5
old 8.4
back 8.3
classroom 8.2
one 8.2
musical instrument 8.2
sexy 8
design 7.9
child 7.8
education 7.8
barbershop 7.8
leader 7.7
reading 7.6
relax 7.6
fun 7.5
silhouette 7.4
phone 7.4
happiness 7

Google
created on 2022-02-05

Microsoft
created on 2022-02-05

text 96.3
black and white 81.4
furniture 68.4
person 66
clothing 58.2
desk 9.6

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 10-18
Gender Female, 97.9%
Calm 98.4%
Sad 0.8%
Surprised 0.3%
Disgusted 0.2%
Confused 0.2%
Angry 0.1%
Happy 0.1%
Fear 0.1%

AWS Rekognition

Age 13-21
Gender Female, 85.6%
Calm 99.8%
Sad 0.1%
Surprised 0%
Disgusted 0%
Confused 0%
Angry 0%
Happy 0%
Fear 0%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Painting
Person 97.2%
Person 96.9%
Person 92.4%
Painting 81.4%

Categories