Human Generated Data

Title

Untitled (three girls at tea with their dolls)

Date

c. 1950

People

Artist: Francis J. Sullivan, American, 1916–1996

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17971

Machine Generated Data

Tags

Amazon
created on 2022-03-04

Person 99.1
Human 99.1
Person 97
Clothing 96.5
Apparel 96.5
Helmet 96.4
Person 94
Chair 79.6
Furniture 79.6
Room 65.7
Indoors 65.7
Overcoat 60.6
Coat 60.6
Flooring 59.3
Leisure Activities 58.1
Suit 57.3
Table 55.2

Clarifai
created on 2023-10-29

people 99.9
adult 98.4
furniture 98.4
group together 98.1
group 97.9
woman 97.2
wear 96.9
two 96.7
sitting 95.4
man 94.9
three 94.5
outfit 94.3
chair 93.8
recreation 93.2
seat 91.8
child 91.5
monochrome 91.5
four 90.2
actress 89.9
vehicle 87.9

Imagga
created on 2022-03-04

man 35.6
person 32.1
room 31.1
people 29
male 26.2
adult 22.4
teacher 20.8
home 19.9
stretcher 19.9
classroom 18.2
indoors 17.6
lifestyle 16.6
office 16.3
sitting 16.3
chair 16.1
litter 15.9
group 15.3
computer 15.3
men 14.6
women 14.2
meeting 14.1
table 14.1
business 14
team 12.5
together 12.3
laptop 12.2
conveyance 12
professional 11.9
happy 11.9
indoor 11.9
communication 11.7
smiling 11.6
interior 11.5
working 11.5
couple 11.3
occupation 11
desk 10.6
businessman 10.6
portrait 10.3
work 10.2
happiness 10.2
inside 10.1
suit 9.9
technology 9.6
talking 9.5
two 9.3
smile 9.3
educator 9.2
patient 9.1
fashion 9
black 9
hair 8.7
education 8.7
corporate 8.6
stringed instrument 8.6
musical instrument 8.5
face 8.5
bowed stringed instrument 8.5
relaxation 8.4
teamwork 8.3
phone 8.3
businesswoman 8.2
family 8
couch 7.7
modern 7.7
attractive 7.7
child 7.7
health 7.6
student 7.6
case 7.5
dark 7.5
relationship 7.5
exercise 7.3
music 7.2
grandfather 7.1
love 7.1
kid 7.1

Google
created on 2022-03-04

Microsoft
created on 2022-03-04

text 98.6
table 95
furniture 91.4
person 90.2
drawing 80.7
cartoon 71.8
clothing 70
black and white 61.2
chair 58

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 31-41
Gender Male, 98%
Calm 34.5%
Sad 30.4%
Surprised 12.3%
Confused 8.8%
Fear 7.9%
Disgusted 3%
Happy 1.7%
Angry 1.3%

AWS Rekognition

Age 27-37
Gender Male, 84.7%
Sad 95.1%
Angry 1.7%
Calm 1.6%
Confused 0.9%
Disgusted 0.4%
Surprised 0.2%
Happy 0.1%
Fear 0.1%

Feature analysis

Amazon

Person
Helmet
Person 99.1%
Person 97%
Person 94%
Helmet 96.4%

Text analysis

Amazon

MIR
3T
3T MIR YE3RAS
YE3RAS