Human Generated Data

Title

Untitled (two girls reading comics on floor)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17719

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Furniture 98.4
Living Room 95.3
Indoors 95.3
Room 95.3
Human 93.2
Home Decor 89.4
Person 87.1
Couch 84.6
Fireplace 83.7
People 79.4
Clothing 78.5
Apparel 78.5
Person 75.2
Female 70.9
Chair 70.7
Art 67.9
Person 67.7
Screen 67.2
Electronics 67.2
Girl 64.4
Bed 63.8
Photography 62.9
Photo 62.9
Table 58.2
Cabinet 57.4
Bedroom 56.8
Monitor 55.9
Display 55.9

Clarifai
created on 2023-10-29

people 99.7
adult 97.8
two 96.9
man 96
group 93.8
one 93.1
furniture 92.9
monochrome 92.4
home 90.3
room 89.4
woman 88.4
three 88.1
group together 87.2
child 86.8
seat 85.7
veil 85.5
wear 85.4
art 83.1
recreation 82.8
street 80.4

Imagga
created on 2022-02-26

chair 41.3
seat 29.3
rocking chair 25.4
furniture 19.8
old 12.5
shopping cart 12.1
black 11.4
device 10.9
man 10.7
work 10.4
equipment 10.3
sky 10.2
holiday 10
basket 9.9
musical instrument 9.9
metal 9.7
empty 9.4
helmet 9.3
travel 9.1
people 8.9
wicker 8.8
person 8.8
container 8.3
bass 8.3
shopping 8.3
furnishing 8.2
wheelchair 8.1
football helmet 8.1
interior 8
handcart 7.9
glass 7.8
cart 7.8
wheeled vehicle 7.6
human 7.5
backboard 7.5
decoration 7.5
floor 7.4
water 7.3
transport 7.3
building 7.2
adult 7.1
summer 7.1
day 7.1
architecture 7

Microsoft
created on 2022-02-26

text 98.3
furniture 76.5
house 71.6
fireplace 68.3
old 42.9

Face analysis

Amazon

AWS Rekognition

Age 23-31
Gender Male, 93.1%
Calm 56.6%
Happy 38.4%
Sad 2.4%
Disgusted 1.2%
Confused 0.6%
Angry 0.3%
Fear 0.3%
Surprised 0.2%

Feature analysis

Amazon

Person
Person 87.1%
Person 75.2%
Person 67.7%

Captions

Microsoft
created on 2022-02-26

an old photo of a person 54.1%
old photo of a person 51.3%
a old photo of a person 48.7%

Text analysis

Amazon

010
21

Google

TA3A2.
TA3A2.