Human Generated Data

Title

Untitled (little boy and girl with carved pumpkin on front steps)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.16959

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Clothing 99.6
Apparel 99.6
Person 99.6
Human 99.6
Person 98.2
Shorts 91.2
Flooring 88.5
Floor 83.3
People 82.7
Chair 73.1
Furniture 73.1
Footwear 72.3
Female 72.3
Wood 67.8
Sport 67.8
Sports 67.8
Shoe 66.5
Girl 62.9
Leisure Activities 61.6
Face 59.9
Portrait 59.9
Photo 59.9
Photography 59.9
Child 58.8
Kid 58.8
Couch 58.2
Icing 55.9
Food 55.9
Dessert 55.9
Cream 55.9
Creme 55.9
Cake 55.9

Imagga
created on 2022-02-26

man 33.2
person 32.7
ballplayer 28.1
player 26.9
people 26.8
sword 24.7
athlete 23.4
male 22.1
adult 21.2
happy 20
weapon 19.7
contestant 19
home 17.5
professional 17.1
smiling 16.6
child 16.3
sitting 14.6
smile 13.5
boy 13
men 12.9
happiness 12.5
family 12.5
indoors 12.3
lifestyle 12.3
room 12.2
two 11.9
house 11.7
baseball glove 11.7
sport 11.7
portrait 11.6
teacher 11.6
holding 11.6
standing 11.3
ball 11
playing 10.9
couple 10.5
office 10.4
youth 10.2
student 10
nurse 9.9
handsome 9.8
mask 9.8
interior 9.7
businessman 9.7
uniform 9.5
toilet tissue 9.5
golfer 9.3
face 9.2
leisure 9.1
business 9.1
exercise 9.1
health 9
black 9
active 9
fun 9
kid 8.9
to 8.9
together 8.8
senior 8.4
teen 8.3
dress 8.1
working 8
clothing 7.9
equipment 7.8
world 7.8
education 7.8
life 7.8
play 7.8
patient 7.7
helmet 7.6
casual 7.6
tissue 7.5
club 7.5
joy 7.5
alone 7.3
teenager 7.3
children 7.3
looking 7.2
recreation 7.2
childhood 7.2
game 7.1
women 7.1
job 7.1
school 7.1
medical 7.1
football helmet 7
look 7

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

person 97.7
clothing 94
text 90.3
black and white 88.2
footwear 76.1
human face 61.1

Face analysis

Amazon

Google

AWS Rekognition

Age 23-33
Gender Female, 55.1%
Calm 85.8%
Sad 13.1%
Confused 0.5%
Surprised 0.2%
Angry 0.1%
Disgusted 0.1%
Fear 0.1%
Happy 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.6%
Shoe 66.5%

Captions

Microsoft

a man standing next to a window 62.1%
a man standing in front of a window 62%
a young man standing next to a window 37.1%

Text analysis

Amazon

6
NACION
TESAS NACION
TESAS

Google

NAGON
NAGON