Human Generated Data

Title

Untitled (little boy feeding baby)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17121

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Chair 99.3
Furniture 99.3
Apparel 98.9
Footwear 98.9
Clothing 98.9
Shoe 98.9
Person 98.8
Human 98.8
Person 97.2
Shoe 96.2
Person 95.5
Shoe 90.9
Face 85.2
Indoors 82
Room 82
Food 81.9
Meal 81.9
Suit 77.8
Coat 77.8
Overcoat 77.8
Table 74.3
Female 71.7
Dining Table 71.1
Portrait 66.4
Photo 66.4
Photography 66.4
Floor 65.4
Flooring 65.2
Blonde 63.4
Teen 63.4
Woman 63.4
Girl 63.4
Child 63.4
Kid 63.4
Dish 63.1
People 61.6
Door 58
Sleeve 57.6
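
The Amazon tag list above pairs each detected label with a confidence score. A minimal sketch of filtering such label/score pairs by a threshold, using a few values copied from the list (the function name and threshold are illustrative assumptions, not part of the record):

```python
# Sketch: filtering Rekognition-style label/confidence pairs by a
# threshold. The labels are copied from the tag list above; the
# function name and default threshold are illustrative assumptions.
labels = [
    ("Chair", 99.3), ("Furniture", 99.3), ("Apparel", 98.9),
    ("Person", 98.8), ("Face", 85.2), ("Indoors", 82.0),
    ("Portrait", 66.4), ("Sleeve", 57.6),
]

def confident_labels(pairs, threshold=90.0):
    """Keep only labels scored at or above the threshold."""
    return [name for name, score in pairs if score >= threshold]

print(confident_labels(labels))  # ['Chair', 'Furniture', 'Apparel', 'Person']
```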

Imagga
created on 2022-02-26

chair 80.3
seat 44.8
furniture 39.8
room 39.3
table 39.3
interior 37.1
house 23.4
folding chair 22.6
floor 21.4
classroom 20.7
home 19.9
chairs 19.6
wood 17.5
inside 17.5
indoors 16.7
modern 15.4
blackboard 15.3
equipment 15
kitchen 14.7
design 14.6
rocking chair 14.4
dining 14.3
restaurant 14.2
indoor 13.7
style 13.3
decor 13.2
furnishing 11.8
glass 10.9
building 10.7
device 10.6
plant 10.4
contemporary 10.3
luxury 10.3
wall 10.2
architecture 10.1
lifestyle 10.1
stylish 9.9
stool 9.8
support 9.5
light 9.3
window 9.1
tables 8.9
person 8.8
wooden 8.8
school 8.7
class 8.7
residential 8.6
empty 8.6
sitting 8.6
nobody 8.5
drink 8.3
health 8.3
brass 8.1
decoration 8
lunch 7.9
stove 7.9
outside 7.7
decorate 7.6
dinner 7.6
elegance 7.5
weight 7.5
leisure 7.5
outdoors 7.5
man 7.4
cook 7.3
group 7.2
people 7.2
summer 7.1
day 7.1

Microsoft
created on 2022-02-26

indoor 96.6
table 95.6
furniture 95.3
chair 93.6
black and white 91.4
person 88.5
clothing 82.3
room 75.2
text 60.3

Face analysis

AWS Rekognition

Age 6-12
Gender Female, 97.4%
Calm 82.1%
Sad 10.3%
Happy 3.4%
Angry 1%
Disgusted 1%
Confused 0.9%
Surprised 0.8%
Fear 0.6%
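
The emotion readout above is a set of per-emotion confidence scores. A minimal sketch of picking the dominant emotion from such scores (values copied from the readout; the variable names are illustrative assumptions):

```python
# Sketch: picking the dominant emotion from Rekognition-style
# emotion/confidence scores (values copied from the readout above).
emotions = {
    "Calm": 82.1, "Sad": 10.3, "Happy": 3.4, "Angry": 1.0,
    "Disgusted": 1.0, "Confused": 0.9, "Surprised": 0.8, "Fear": 0.6,
}
dominant = max(emotions, key=emotions.get)
print(dominant)  # Calm
```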

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
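
Google Vision reports face attributes as ordinal likelihood strings rather than numeric scores. A minimal sketch of comparing such readings on the scale (the scale mirrors the Vision API's Likelihood enum; the helper name is an illustrative assumption):

```python
# Sketch: comparing Google Vision-style likelihood readings on an
# ordinal scale. The scale mirrors the Vision API's Likelihood enum;
# the helper name is an illustrative assumption.
LIKELIHOOD_SCALE = [
    "Very unlikely", "Unlikely", "Possible", "Likely", "Very likely",
]

def at_least(reading, floor):
    """True when a reading sits at or above the floor on the scale."""
    return LIKELIHOOD_SCALE.index(reading) >= LIKELIHOOD_SCALE.index(floor)

print(at_least("Very unlikely", "Possible"))  # False
```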

Feature analysis

Amazon

Chair 99.3%
Shoe 98.9%
Person 98.8%

Captions

Microsoft

a group of people in a room 91%
a group of people sitting in chairs in a room 86.8%
a group of people sitting in a room 84.9%

Text analysis

Amazon

ACION
YY3 14.2 ACION
14.2
YY3