Human Generated Data

Title

Untitled (three little girls having a tea party)

Date

May 1957

People

Artist: Francis J. Sullivan, American, 1916-1996

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17972

Machine Generated Data

Tags

Amazon
created on 2022-03-04

Person 98.8
Human 98.8
Apparel 97.6
Clothing 97.6
Person 94.9
Furniture 93.8
Chair 93.8
Person 93.6
Helmet 91.1
Face 90.8
Cake 87.9
Dessert 87.9
Icing 87.9
Cream 87.9
Food 87.9
Creme 87.9
Tabletop 85.2
Table 81.3
Meal 77.2
Dish 75.1
Costume 74
Photography 66.9
Portrait 66.9
Photo 66.9
People 64.4
Hat 63.5
Bed 57.1
Dining Table 56.5
Person 56.3
Indoors 55.8
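The Amazon tag list above is the kind of output returned by the AWS Rekognition DetectLabels API. A minimal sketch of how such a list might be flattened from a response; the bucket and object names in the commented call are hypothetical, and the mock response below only mimics the DetectLabels shape with values from this record:

```python
def top_labels(response, min_confidence=55.0):
    """Flatten a Rekognition DetectLabels response into (name, confidence)
    pairs, highest confidence first, mirroring the tag list above."""
    pairs = [
        (label["Name"], round(label["Confidence"], 1))
        for label in response["Labels"]
        if label["Confidence"] >= min_confidence
    ]
    return sorted(pairs, key=lambda p: p[1], reverse=True)

# A real call requires AWS credentials and would look roughly like:
#   import boto3
#   client = boto3.client("rekognition")
#   response = client.detect_labels(
#       Image={"S3Object": {"Bucket": "my-bucket", "Name": "photo.jpg"}},  # hypothetical names
#       MinConfidence=55,
#   )

# Mock response in the DetectLabels shape:
response = {"Labels": [
    {"Name": "Person", "Confidence": 98.83},
    {"Name": "Chair", "Confidence": 93.81},
    {"Name": "Cake", "Confidence": 87.92},
]}
print(top_labels(response))
```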

Imagga
created on 2022-03-04

man 32.9
person 26.5
people 23.4
male 23.4
computer 22
device 19.9
adult 19.1
laptop 18.7
working 16.8
surgeon 16.6
room 16.3
home 15.9
indoors 15.8
office 15.7
work 15.7
sitting 15.5
business 14.6
equipment 14.4
worker 14.3
men 13.7
table 13.6
professional 13.4
technology 13.3
lifestyle 13
desk 12.7
job 12.4
nurse 12.3
instrument 11.5
electric chair 11.2
patient 11.1
communication 10.9
happy 10.6
senior 10.3
black 10.2
projector 10.2
modern 9.8
chair 9.4
industry 9.4
phone 9.2
studio 9.1
monitor 9
instrument of execution 9
interior 8.8
businessman 8.8
looking 8.8
casual 8.5
health 8.3
smiling 8
machine 7.9
smile 7.8
old 7.7
seat 7.6
businesspeople 7.6
meeting 7.5
keyboard 7.5
human 7.5
retro 7.4
occupation 7.3
hospital 7.3
lady 7.3
businesswoman 7.3
group 7.2
team 7.2
women 7.1
medical 7.1

Microsoft
created on 2022-03-04

text 99
table 90.5
black and white 90.4
furniture 74.7
person 64.1
clothing 63.7

Face analysis

Amazon

AWS Rekognition

Age 6-16
Gender Female, 92.9%
Happy 46.6%
Sad 25.3%
Calm 21%
Fear 3.4%
Confused 1.2%
Disgusted 1.1%
Surprised 1%
Angry 0.4%
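The face analysis fields above (age range, gender, ranked emotions) match what Rekognition's DetectFaces API returns when called with `Attributes=["ALL"]`. A sketch of condensing one FaceDetail into those fields, using a mock in the documented shape with this record's values:

```python
def summarize_face(face):
    """Condense one Rekognition FaceDetail into the fields shown above:
    an age range, a gender with confidence, and the dominant emotion."""
    emotions = sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True)
    return {
        "age": f'{face["AgeRange"]["Low"]}-{face["AgeRange"]["High"]}',
        "gender": f'{face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%',
        "top_emotion": emotions[0]["Type"],
    }

# A real call (credentials required) would be roughly:
#   client.detect_faces(Image={"S3Object": {...}}, Attributes=["ALL"])

# Mock FaceDetail in the DetectFaces response shape:
face = {
    "AgeRange": {"Low": 6, "High": 16},
    "Gender": {"Value": "Female", "Confidence": 92.9},
    "Emotions": [
        {"Type": "HAPPY", "Confidence": 46.6},
        {"Type": "SAD", "Confidence": 25.3},
        {"Type": "CALM", "Confidence": 21.0},
    ],
}
print(summarize_face(face))
```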

Feature analysis

Amazon

Person 98.8%
Helmet 91.1%

Captions

Microsoft

a group of people looking at a book 45.1%
a group of people riding on the back of a book 28.7%
a group of people riding on top of a book 27.9%
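The ranked captions above resemble the output of Azure Computer Vision's image-description feature, which returns candidate captions with confidences between 0 and 1. A sketch of ranking them as percentages, operating on a mock response in that shape (the REST call itself, which needs an endpoint and key, is only hinted at in the comment):

```python
def ranked_captions(response):
    """Sort image-description captions by confidence, expressed as percentages."""
    captions = response["description"]["captions"]
    return [
        (c["text"], round(c["confidence"] * 100, 1))
        for c in sorted(captions, key=lambda c: c["confidence"], reverse=True)
    ]

# A real request would POST the image to the service's describe/analyze
# endpoint with a subscription key; the JSON body has this shape:
response = {"description": {"captions": [
    {"text": "a group of people looking at a book", "confidence": 0.451},
    {"text": "a group of people riding on the back of a book", "confidence": 0.287},
]}}
print(ranked_captions(response))
```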

Text analysis

Amazon

|
OCTOR
| WARRED
Sand
WARRED
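The text detections above are the kind of output produced by Rekognition's DetectText API, which reports each detection at both LINE and WORD granularity. A sketch of keeping only the line-level strings, run against a mock response in the DetectText shape (the detected strings here are illustrative, echoing the record above):

```python
def detected_lines(response):
    """Keep only LINE-level detections from a Rekognition DetectText
    response; each LINE is also broken into WORD entries, which we skip."""
    return [
        d["DetectedText"]
        for d in response["TextDetections"]
        if d["Type"] == "LINE"
    ]

# A real call (credentials required) would be roughly:
#   client.detect_text(Image={"S3Object": {...}})

# Mock response in the DetectText shape:
response = {"TextDetections": [
    {"Type": "LINE", "DetectedText": "OCTOR"},
    {"Type": "WORD", "DetectedText": "OCTOR"},
    {"Type": "LINE", "DetectedText": "Sand"},
    {"Type": "WORD", "DetectedText": "Sand"},
]}
print(detected_lines(response))
```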