Human Generated Data

Title

Untitled (three little girls having a tea party)

Date

May 1957

People

Artist: Francis J. Sullivan, American, 1916-1996

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17972

Machine Generated Data

Tags

Amazon
created on 2022-03-04

Person 98.8
Human 98.8
Clothing 97.6
Apparel 97.6
Person 94.9
Chair 93.8
Furniture 93.8
Person 93.6
Helmet 91.1
Face 90.8
Icing 87.9
Food 87.9
Cake 87.9
Dessert 87.9
Cream 87.9
Creme 87.9
Tabletop 85.2
Table 81.3
Meal 77.2
Dish 75.1
Costume 74
Portrait 66.9
Photography 66.9
Photo 66.9
People 64.4
Hat 63.5
Bed 57.1
Dining Table 56.5
Person 56.3
Indoors 55.8
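
These labels match the output shape of Amazon Rekognition's DetectLabels operation. A minimal sketch of how a comparable tag list could be produced with boto3 follows; the region, image file, and MinConfidence threshold are assumptions, since the record does not state the request parameters used.

    import boto3

    # Assumed region and local image file; credentials come from the environment.
    client = boto3.client("rekognition", region_name="us-east-1")

    with open("sullivan_tea_party.jpg", "rb") as f:
        response = client.detect_labels(
            Image={"Bytes": f.read()},
            MinConfidence=55,  # the lowest label in this record is 55.8
        )

    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")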

Clarifai
created on 2023-10-29

people 99.8
group together 99.3
adult 98.9
man 97.8
wear 97.1
group 97
two 96.7
three 95.8
outfit 95.2
sitting 93.8
recreation 93.3
several 92.8
veil 92.4
vehicle 92.1
four 92
woman 91.3
one 90.7
child 90.2
monochrome 89.1
sports equipment 87.6
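
Clarifai reports concept confidences on a 0-1 scale, shown above as percentages. A hedged sketch against Clarifai's v2 predict REST endpoint; the API key, model id, and image URL are placeholders, and the model version that produced these tags in 2023 is not recorded.

    import requests

    # Placeholder key, model id, and image URL.
    headers = {"Authorization": "Key YOUR_CLARIFAI_API_KEY"}
    payload = {
        "inputs": [
            {"data": {"image": {"url": "https://example.org/tea_party.jpg"}}}
        ]
    }
    resp = requests.post(
        "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
        headers=headers,
        json=payload,
    )
    for concept in resp.json()["outputs"][0]["data"]["concepts"]:
        print(f"{concept['name']} {concept['value'] * 100:.1f}")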

Imagga
created on 2022-03-04

man 32.9
person 26.5
people 23.4
male 23.4
computer 22
device 19.9
adult 19.1
laptop 18.7
working 16.8
surgeon 16.6
room 16.3
home 15.9
indoors 15.8
office 15.7
work 15.7
sitting 15.5
business 14.6
equipment 14.4
worker 14.3
men 13.7
table 13.6
professional 13.4
technology 13.3
lifestyle 13
desk 12.7
job 12.4
nurse 12.3
instrument 11.5
electric chair 11.2
patient 11.1
communication 10.9
happy 10.6
senior 10.3
black 10.2
projector 10.2
modern 9.8
chair 9.4
industry 9.4
phone 9.2
studio 9.1
monitor 9
instrument of execution 9
interior 8.8
businessman 8.8
looking 8.8
casual 8.5
health 8.3
smiling 8
machine 7.9
smile 7.8
old 7.7
seat 7.6
businesspeople 7.6
meeting 7.5
keyboard 7.5
human 7.5
retro 7.4
occupation 7.3
hospital 7.3
lady 7.3
businesswoman 7.3
group 7.2
team 7.2
women 7.1
medical 7.1
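
Imagga's tag list comes from its v2 tagging endpoint. A minimal sketch, assuming placeholder credentials and a public image URL; Imagga authenticates with HTTP basic auth.

    import requests

    # Placeholder credentials and image URL.
    auth = ("YOUR_IMAGGA_API_KEY", "YOUR_IMAGGA_API_SECRET")
    resp = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": "https://example.org/tea_party.jpg"},
        auth=auth,
    )
    for entry in resp.json()["result"]["tags"]:
        print(f"{entry['tag']['en']} {entry['confidence']:.1f}")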

Microsoft
created on 2022-03-04

text 99
table 90.5
black and white 90.4
furniture 74.7
person 64.1
clothing 63.7
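
Microsoft's tags correspond to the Azure Computer Vision Analyze Image operation. A minimal sketch, assuming a placeholder resource endpoint, key, and image URL, and the v3.2 REST API (the version used in 2022 is not recorded); Azure also reports confidences on a 0-1 scale.

    import requests

    # Placeholder endpoint, key, and image URL.
    endpoint = "https://YOUR_RESOURCE.cognitiveservices.azure.com"
    headers = {"Ocp-Apim-Subscription-Key": "YOUR_AZURE_KEY"}
    params = {"visualFeatures": "Tags"}
    body = {"url": "https://example.org/tea_party.jpg"}
    resp = requests.post(
        f"{endpoint}/vision/v3.2/analyze", headers=headers, params=params, json=body
    )
    for tag in resp.json()["tags"]:
        print(f"{tag['name']} {tag['confidence'] * 100:.1f}")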

Face analysis

Amazon

AWS Rekognition

Age 6-16
Gender Female, 92.9%
Happy 46.6%
Sad 25.3%
Calm 21%
Fear 3.4%
Confused 1.2%
Disgusted 1.1%
Surprised 1%
Angry 0.4%
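
The age range, gender, and emotion scores above are standard fields of Rekognition's DetectFaces response. A minimal sketch, assuming the same placeholder region and image file as above; Attributes=["ALL"] is required, since the default response omits age, gender, and emotion estimates.

    import boto3

    client = boto3.client("rekognition", region_name="us-east-1")

    with open("sullivan_tea_party.jpg", "rb") as f:
        response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
        for emotion in face["Emotions"]:
            print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")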

Feature analysis

Amazon

Person 98.8%
Person 94.9%
Person 93.6%
Person 56.3%
Helmet 91.1%

Text analysis

Amazon

|
OCTOR
| WARRED
Sand
WARRED
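
The fragments above are raw Rekognition DetectText output, preserved as-is; OCR on a mid-century print can yield partial strings like these. A minimal sketch, assuming the same placeholder region and image file:

    import boto3

    client = boto3.client("rekognition", region_name="us-east-1")

    with open("sullivan_tea_party.jpg", "rb") as f:
        response = client.detect_text(Image={"Bytes": f.read()})

    # DetectText returns both LINE and WORD detections, so fragments can repeat.
    for detection in response["TextDetections"]:
        if detection["Type"] == "LINE":
            print(detection["DetectedText"])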