Human Generated Data

Title

Untitled (mother seated with son and daughter on sofa in front of window)

Date

c. 1955

People

Artist: Paul Gittings, American, 1900 - 1988

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.12554

Machine Generated Data

Tags (label, confidence score out of 100)

Amazon
created on 2022-01-29

Person 99
Human 99
Person 99
Person 97.5
Interior Design 96.3
Indoors 96.3
Living Room 95.4
Room 95.4
Furniture 95.1
Chair 93.2
Couch 89.4
Table Lamp 88.8
Lamp 88.8
Bed 76.2
Female 73.8
People 72.1
Housing 67.3
Building 67.3
Face 62.7
Kid 62.6
Child 62.6
Girl 62.5
Meal 58.8
Food 58.8
Window 58.3
Dining Room 57
Dining Table 56.6
Table 56.6
Flooring 56.3
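
The Amazon labels above are the kind of result returned by AWS Rekognition's DetectLabels operation, which scores each label from 0 to 100. A minimal Python sketch, assuming a local copy of the photograph and AWS credentials in the environment; the helper name and file name are illustrative, not part of this record:

import boto3

def rekognition_labels(path, min_confidence=50):
    """Return (label, confidence) pairs from Amazon Rekognition DetectLabels."""
    client = boto3.client("rekognition")  # region and credentials come from the environment
    with open(path, "rb") as f:
        response = client.detect_labels(
            Image={"Bytes": f.read()},
            MaxLabels=50,
            MinConfidence=min_confidence,
        )
    # Confidence is a 0-100 score, matching the numbers listed above.
    return [(label["Name"], label["Confidence"]) for label in response["Labels"]]

# Hypothetical usage:
# for name, conf in rekognition_labels("gittings_untitled.jpg"):
#     print(f"{name} {conf:.1f}")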

Clarifai
created on 2023-10-27

people 99.9
group 98.3
room 98
furniture 97.8
adult 97.5
child 97.5
sit 95.7
woman 95
group together 93.9
man 92.4
home 92.3
seat 91.9
boy 91.9
education 91.6
offspring 90.4
two 90.1
indoors 87.4
family 87.3
sibling 87.1
dining room 83.9

Imagga
created on 2022-01-29

kitchen 54.6
interior 46
dishwasher 40.7
home 35.1
white goods 35
room 34.5
furniture 32.9
table 32.8
house 32.6
home appliance 28.1
bakery 26.6
appliance 26.2
modern 25.2
stove 25.1
restaurant 25
shop 23.8
indoor 21.9
chair 21.1
decor 20.3
counter 20.2
luxury 19.7
oven 19.6
food 17.7
glass 17.1
mercantile establishment 16.8
indoors 16.7
cabinet 16.7
design 16.3
contemporary 16
floor 15.8
cook 15.5
cooking 14.8
man 14.8
dining 14.3
wood 14.2
inside 13.8
decoration 13.7
lifestyle 13.7
sink 13.7
plate 13.6
style 13.3
clinic 13.1
hotel 12.4
service 12
refrigerator 11.8
architecture 11.7
clean 11.7
dining table 11.7
window 11.4
dinner 11.3
place of business 11.2
faucet 10.8
stainless 10.7
people 10.6
apartment 10.5
expensive 10.5
elegant 10.3
breakfast 10.2
smiling 10.1
happy 10
male 10
work 9.7
domestic 9.4
eat 9.2
durables 9.2
meal 9.2
holding 9.1
person 9
cheerful 8.9
wooden 8.8
luxurious 8.8
door 8.7
setting 8.7
residential 8.6
estate 8.5
two 8.5
drink 8.4
coffee 8.3
area 8.3
light 8
women 7.9
cabinets 7.9
remodel 7.9
furnishing 7.9
couple 7.8
hospital 7.7
sliding door 7.7
bar 7.4
cafeteria 7.3
new 7.3
dish 7.2
adult 7.1
building 7.1
steel 7.1
machine 7
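
The Imagga tags match the shape of its v2 tagging endpoint. A hedged sketch against that REST API with the requests library; the credentials are placeholders and the endpoint and response structure are assumed from Imagga's public v2 API, not taken from this record:

import requests

IMAGGA_KEY = "your-api-key"        # placeholder credentials
IMAGGA_SECRET = "your-api-secret"  # placeholder credentials

def imagga_tags(image_url):
    """Return (tag, confidence) pairs from Imagga's /v2/tags endpoint."""
    resp = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": image_url},
        auth=(IMAGGA_KEY, IMAGGA_SECRET),  # HTTP Basic auth with the key/secret pair
    )
    resp.raise_for_status()
    # Each entry looks like {"confidence": 54.6, "tag": {"en": "kitchen"}}
    return [(t["tag"]["en"], t["confidence"]) for t in resp.json()["result"]["tags"]]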

Google
created on 2022-01-29

Microsoft
created on 2022-01-29

text 97.9
indoor 95.4
house 93.7
window 91.6
table 86.2
furniture 84.7
person 82
clothing 76.9
black and white 56.9
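
The Microsoft tags resemble output from Azure Computer Vision's Analyze Image operation. A rough sketch against the v3.2 REST endpoint; the endpoint URL and key are placeholders, and since Azure reports confidence in the 0-1 range the values are scaled to percent to match the listing above:

import requests

AZURE_ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"  # placeholder
AZURE_KEY = "your-subscription-key"                                     # placeholder

def azure_tags(path):
    """Return (tag, confidence %) pairs from Azure Computer Vision Analyze Image (v3.2)."""
    with open(path, "rb") as f:
        resp = requests.post(
            f"{AZURE_ENDPOINT}/vision/v3.2/analyze",
            params={"visualFeatures": "Tags"},
            headers={
                "Ocp-Apim-Subscription-Key": AZURE_KEY,
                "Content-Type": "application/octet-stream",
            },
            data=f.read(),
        )
    resp.raise_for_status()
    # Azure confidences are in [0, 1]; scale to percent for comparison with the list above.
    return [(t["name"], 100 * t["confidence"]) for t in resp.json()["tags"]]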

Color Analysis

Face analysis

AWS Rekognition

Age 22-30
Gender Male, 95.3%
Calm 94.1%
Surprised 5.6%
Angry 0.1%
Disgusted 0.1%
Confused 0%
Sad 0%
Happy 0%
Fear 0%

AWS Rekognition

Age 25-35
Gender Female, 74.7%
Surprised 99.2%
Calm 0.3%
Happy 0.2%
Confused 0.1%
Disgusted 0%
Fear 0%
Angry 0%
Sad 0%

AWS Rekognition

Age 1-7
Gender Male, 63%
Calm 98.8%
Surprised 0.6%
Sad 0.2%
Disgusted 0.1%
Angry 0.1%
Happy 0.1%
Fear 0.1%
Confused 0%
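
The three blocks above follow the shape of AWS Rekognition's DetectFaces response: an estimated age range, a gender guess with a confidence score, and emotions ranked by confidence. A minimal boto3 sketch, again assuming a local image file and configured credentials; the helper name is illustrative:

import boto3

def rekognition_faces(path):
    """Summarize each face found by Amazon Rekognition DetectFaces."""
    client = boto3.client("rekognition")
    with open(path, "rb") as f:
        response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])
    summaries = []
    for face in response["FaceDetails"]:
        emotions = sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True)
        summaries.append({
            "age": (face["AgeRange"]["Low"], face["AgeRange"]["High"]),
            "gender": (face["Gender"]["Value"], face["Gender"]["Confidence"]),
            "emotions": [(e["Type"], e["Confidence"]) for e in emotions],
        })
    return summaries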

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
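
The Google Vision face results are likelihood ratings rather than percentages, drawn from a six-value scale. A minimal sketch with the google-cloud-vision client, assuming a local image file and application default credentials; the helper name is illustrative:

from google.cloud import vision

# Likelihood enum values are returned as integers 0-5.
LIKELIHOOD = ("UNKNOWN", "VERY_UNLIKELY", "UNLIKELY", "POSSIBLE", "LIKELY", "VERY_LIKELY")

def vision_face_likelihoods(path):
    """Return the likelihood ratings Google Vision assigns to each detected face."""
    client = vision.ImageAnnotatorClient()
    with open(path, "rb") as f:
        image = vision.Image(content=f.read())
    response = client.face_detection(image=image)
    faces = []
    for face in response.face_annotations:
        faces.append({
            "surprise": LIKELIHOOD[face.surprise_likelihood],
            "anger": LIKELIHOOD[face.anger_likelihood],
            "sorrow": LIKELIHOOD[face.sorrow_likelihood],
            "joy": LIKELIHOOD[face.joy_likelihood],
            "headwear": LIKELIHOOD[face.headwear_likelihood],
            "blurred": LIKELIHOOD[face.blurred_likelihood],
        })
    return faces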

Feature analysis

Amazon

Person 99%
Person 99%
Person 97.5%
Bed 76.2%

Text analysis

Amazon

3
7
:D) 3 7
YT37A2
MMR YT37A2
MMR
:D)
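
The Amazon text results mix line-level and word-level detections, which is why ":D) 3 7" appears alongside its individual tokens. A minimal boto3 sketch of the DetectText operation, assuming a local image file:

import boto3

def rekognition_text(path):
    """Return (text, type, confidence) for each detection from Amazon Rekognition DetectText."""
    client = boto3.client("rekognition")
    with open(path, "rb") as f:
        response = client.detect_text(Image={"Bytes": f.read()})
    # Type is either "LINE" or "WORD"; each line is also broken out into its words.
    return [
        (d["DetectedText"], d["Type"], d["Confidence"])
        for d in response["TextDetections"]
    ]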

Google

MNA YT37A2 032n
MNA
YT37A2
032n
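
Google's text results follow the same pattern: the first annotation is the full detected string, followed by the individual tokens. A minimal sketch with the google-cloud-vision client, under the same assumptions as the face-detection sketch above:

from google.cloud import vision

def vision_text(path):
    """Return the text annotations Google Vision finds in an image."""
    client = vision.ImageAnnotatorClient()
    with open(path, "rb") as f:
        image = vision.Image(content=f.read())
    response = client.text_detection(image=image)
    # The first annotation is the full string; the rest are individual tokens.
    return [t.description for t in response.text_annotations]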