Human Generated Data

Title

Untitled (girl sitting at table of toys in Christmas living room full of gifts)

Date

c. 1940-1960

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9893

Machine Generated Data

Tags (confidence scores out of 100)

Amazon
created on 2022-01-28

Restaurant 98.5
Furniture 97.6
Chair 97.6
Human 96.8
Person 96.8
Person 95.2
Cafe 94.8
Cafeteria 78.6
Sitting 75.7
Table 68
Female 67.2
Photography 61.1
Portrait 61.1
Photo 61.1
Face 61.1
Food 57.8
Meal 57.8
Electronics 56.2
Screen 56.2
Food Court 55.8
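Label lists like the Amazon block above are the typical output of an image-labeling API such as AWS Rekognition's detect_labels. As a minimal sketch, the name/confidence pairs can be pulled out of a Rekognition-shaped response like this; note the `response` dict is a hand-built sample mirroring the first few tags in this record, not the actual API result for the photograph (a real call would go through `boto3.client("rekognition").detect_labels(...)`):

```python
# Hand-built sample shaped like a Rekognition detect_labels response,
# seeded with the top tags from the record above.
response = {
    "Labels": [
        {"Name": "Restaurant", "Confidence": 98.5},
        {"Name": "Furniture", "Confidence": 97.6},
        {"Name": "Chair", "Confidence": 97.6},
    ]
}

def top_labels(resp, min_confidence=0.0):
    """Return (name, confidence) pairs sorted by descending confidence."""
    pairs = [(lbl["Name"], lbl["Confidence"]) for lbl in resp["Labels"]]
    return sorted(
        (p for p in pairs if p[1] >= min_confidence),
        key=lambda p: -p[1],
    )

# Print tags in the same "Name score" format used in this record.
for name, conf in top_labels(response, min_confidence=97.0):
    print(f"{name} {conf:.1f}")
```

Filtering by `min_confidence` is how low-certainty tags (like "Food Court 55.8" above) would be dropped from a display.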

Imagga
created on 2022-01-28

table 26.8
interior 24.7
chair 24.1
salon 23.5
computer 22.8
man 21.5
people 21.2
restaurant 18.3
business 18.2
room 18.1
work 17.7
indoors 17.6
lifestyle 17.3
technology 17.1
home 16.7
person 16.7
laptop 16.6
furniture 16.6
equipment 16.2
modern 16.1
office 15.7
sitting 15.4
working 14.1
appliance 13.9
communication 13.4
building 12.7
shop 12.6
hand blower 12.5
architecture 12.5
design 12.4
kitchen 12.3
device 12.2
floor 12.1
house 11.7
mixer 11.6
seat 11
dinner 10.9
electronic equipment 10.9
chairs 10.8
male 10.6
urban 10.5
adult 10.4
party 10.3
desk 10.3
women 10.3
glass 10.3
inside 10.1
occupation 10.1
indoor 10
dryer 10
blower 10
happy 10
wood 10
decor 9.7
together 9.6
luxury 9.4
contemporary 9.4
light 9.3
casual 9.3
home appliance 9.2
window 9.2
leisure 9.1
businesswoman 9.1
men 8.6
smile 8.5
relaxation 8.4
style 8.2
structure 8.1
group 8.1
success 8
job 8
decoration 8
smiling 7.9
durables 7.9
couple 7.8
lunch 7.7
elegant 7.7
kitchen appliance 7.6
dining 7.6
living 7.6
outdoors 7.5
musical instrument 7.3
meal 7.1
mercantile establishment 7
monitor 7

Google
created on 2022-01-28

Microsoft
created on 2022-01-28

text 97.5
black and white 96
person 92.4
concert 91
musical instrument 85.4
furniture 83.5
guitar 80.6
chair 74
clothing 73.8
music 73.1
table 66.5
piano 66.3
cluttered 16.6

Face analysis

AWS Rekognition

Age 49-57
Gender Male, 52.2%
Calm 88.8%
Surprised 4.6%
Sad 2.8%
Fear 1.6%
Angry 1.2%
Confused 0.5%
Disgusted 0.4%
Happy 0.1%
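The age range, gender, and emotion percentages above come from a face-detail structure like the one Rekognition's detect_faces returns. A minimal sketch of flattening such a structure into the lines shown; the `face` dict is a hand-built sample seeded with this record's values, not the live API result (a real call would be `boto3.client("rekognition").detect_faces(Image=..., Attributes=["ALL"])`):

```python
# Hand-built sample shaped like one entry of a detect_faces
# FaceDetails list, seeded with the values in this record.
face = {
    "AgeRange": {"Low": 49, "High": 57},
    "Gender": {"Value": "Male", "Confidence": 52.2},
    "Emotions": [
        {"Type": "CALM", "Confidence": 88.8},
        {"Type": "SURPRISED", "Confidence": 4.6},
        {"Type": "SAD", "Confidence": 2.8},
    ],
}

def summarize_face(f):
    """Flatten a face-detail dict into printable label/value lines,
    emotions sorted by descending confidence."""
    lines = [
        f"Age {f['AgeRange']['Low']}-{f['AgeRange']['High']}",
        f"Gender {f['Gender']['Value']}, {f['Gender']['Confidence']:.1f}%",
    ]
    for emo in sorted(f["Emotions"], key=lambda e: -e["Confidence"]):
        lines.append(f"{emo['Type'].capitalize()} {emo['Confidence']:.1f}%")
    return lines

for line in summarize_face(face):
    print(line)
```

Note the gender call is barely above chance (52.2%), which is why such scores are usually reported alongside their confidence rather than as bare labels.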

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 96.8%

Captions

Microsoft

a group of people sitting at a table 70.4%
a group of people sitting around a table 69.7%
a group of people in a room 69.6%

Text analysis

Amazon

M
M 113 YT33A2 002MA
YT33A2
113
002MA
O

Google

MI3 YT33A2_032MA.
YT33A2_032MA.
MI3