Human Generated Data

Title

Untitled (boy and girl playing with stack of blocks)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17736

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Person 98.3
Human 98.3
Person 98.2
Furniture 96.5
Clothing 95.5
Apparel 95.5
Shoe 94.5
Footwear 94.5
Chair 92.2
Shorts 79.9
Female 75.6
Drawing 74.7
Art 74.7
Advertisement 73.7
Poster 68.3
Home Decor 68
Collage 67.1
Canvas 66.5
Interior Design 66
Indoors 66
Shoe 64.1

Clarifai
created on 2023-10-29

people 99.9
adult 98.8
group 98
man 97.2
furniture 96.8
many 96.7
group together 96.6
child 96
woman 95.4
outfit 93.9
two 93
nostalgia 92.8
monochrome 92.6
wear 91.7
home 90.4
several 89.5
canine 88.7
leader 88.6
family 88.2
administration 87.7

Imagga
created on 2022-02-26

people 22.3
man 22.2
musical instrument 17.4
person 16
world 15
male 14.9
men 12.9
accordion 12.1
sport 10.5
adult 10.4
black 10.2
sky 10.2
wind instrument 10.1
city 10
history 9.8
chair 9.8
art 9.8
keyboard instrument 9.6
culture 9.4
wheeled vehicle 9.3
portrait 9.1
old 9
equipment 8.7
mask 8.6
travel 8.4
hand 8.3
room 8.2
shopping cart 8.2
shop 8.1
lifestyle 7.9
business 7.9
happiness 7.8
youth 7.7
outdoor 7.6
handcart 7.6
park 7.6
kin 7.6
historical 7.5
happy 7.5
tourism 7.4
symbol 7.4
active 7.4
exercise 7.3
hat 7.3
athlete 7.2
work 7.2
family 7.1
summer 7.1

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

text 94.3
clothing 92.5
person 91
footwear 83.8
black and white 81
cartoon 51
house 50.3

Face analysis

AWS Rekognition

Age 6-14
Gender Female, 99.4%
Calm 73.4%
Sad 23.1%
Angry 1%
Happy 0.9%
Fear 0.5%
Disgusted 0.4%
Confused 0.3%
Surprised 0.3%

AWS Rekognition

Age 9-17
Gender Female, 99.2%
Calm 94.8%
Sad 2.7%
Surprised 0.8%
Happy 0.7%
Disgusted 0.3%
Angry 0.3%
Confused 0.3%
Fear 0.2%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Shoe
Person 98.3%
Person 98.2%
Shoe 94.5%
Shoe 64.1%

Categories

Imagga

paintings art 99.9%

Captions

Microsoft
created on 2022-02-26

an old photo of a boy 39.6%
an old photo of a person 39.5%
old photo of a person 39.4%

Text analysis

Amazon

M
G
R
S
A
W
W X
Err
X
T
O
KODAK-SLA
shar

Google

YT37A2 XA
YT37A2
XA