Human Generated Data

Title

Untitled (girls on bunkbeds)

Date

early 1952, printed later

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.172

Machine Generated Data

Tags

Amazon
created on 2021-12-14 (confidence in %)

Furniture 100
Bunk Bed 98.9
Person 94.7
Human 94.7
Chair 93.8
Person 89.2
Person 80.5
Bed 75.7
Person 63.3
Housing 62.4
Building 62.4
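
The labels above match the output shape of Amazon Rekognition's DetectLabels API. A minimal sketch of that call via boto3, assuming AWS credentials are configured; the region and the image file name are placeholders, not details from this record:

    import boto3

    # Placeholder region and file name, for illustration only.
    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open("bunkbeds.jpg", "rb") as f:
        response = rekognition.detect_labels(
            Image={"Bytes": f.read()},
            MaxLabels=20,
            MinConfidence=60,
        )

    # Print each label with its confidence, matching the list above.
    for label in response["Labels"]:
        print(label["Name"], round(label["Confidence"], 1))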

Clarifai
created on 2023-10-25 (confidence in %)

furniture 99.8
people 99.6
chair 98.1
group 98
seat 97.8
room 97.2
shelf 97.1
bed 96.8
child 95.5
man 95.4
two 94.7
adult 94
family 93.5
bookcase 92.7
woman 91.5
inside 90.9
one 90.8
vintage 89.6
set 88
indoors 85.8

Imagga
created on 2021-12-14 (confidence in %)

room 51.9
furniture 49.5
bed 40.1
bedroom 38.2
interior 35.4
car 34.3
motor vehicle 32.6
chair 31.5
model t 29.7
four-poster 28.8
house 23.4
home 22.3
lamp 20.1
armchair 18.6
luxury 18
bedroom furniture 17.3
wheeled vehicle 16.6
vehicle 16.1
table 15.7
hotel 15.3
seat 15
window 13.7
design 13.5
transportation 13.4
travel 13.4
decor 13.3
glass 13.2
indoors 13.2
furnishing 13
golf equipment 12.8
pillow 12.6
relax 12.6
modern 12.6
wood 12.5
inside 12
rest 11.1
transport 10.9
living 10.4
floor 10.2
sports equipment 9.6
empty 9.4
equipment 9.4
lifestyle 9.4
architecture 9.4
light 9.3
cabinet 9.2
city 9.1
business 9.1
old 9
style 8.9
wooden 8.8
apartment 8.6
comfortable 8.6
elegant 8.6
support 8.3
man 8.1
palanquin 7.9
china cabinet 7.9
sleep 7.8
nobody 7.8
comfort 7.7
wall 7.7
sofa 7.5
vacation 7.4
indoor 7.3
new 7.3
decoration 7.2
device 7.2
adult 7.1
night 7.1
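
Imagga's tagger is a plain REST endpoint rather than an SDK. A sketch of requesting scores like those above, assuming Imagga's documented HTTP Basic auth (API key as user, API secret as password); the credentials and file name are placeholders:

    import requests

    API_KEY = "YOUR_API_KEY"        # placeholder
    API_SECRET = "YOUR_API_SECRET"  # placeholder

    with open("bunkbeds.jpg", "rb") as f:
        response = requests.post(
            "https://api.imagga.com/v2/tags",
            auth=(API_KEY, API_SECRET),
            files={"image": f},
        )

    # Each result entry carries a confidence and a language-keyed tag name.
    for tag in response.json()["result"]["tags"]:
        print(tag["tag"]["en"], round(tag["confidence"], 1))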

Microsoft
created on 2021-12-14 (confidence in %)

text 97.5
table 95.1
indoor 87.2
chair 75.7
black and white 63.2
desk 59.9
house 56.3
furniture 33.3

Face analysis

AWS Rekognition

Age 22-34
Gender Female, 99.2%
Calm 62%
Happy 16.5%
Surprised 13.2%
Fear 3.6%
Angry 1.5%
Confused 1.4%
Disgusted 1.2%
Sad 0.6%

AWS Rekognition

Age 8-18
Gender Female, 96.9%
Happy 93.5%
Surprised 3.3%
Calm 1%
Fear 0.9%
Angry 0.5%
Confused 0.4%
Sad 0.2%
Disgusted 0.2%

AWS Rekognition

Age 11-21
Gender Female, 99.5%
Calm 89.4%
Angry 4.9%
Surprised 2.1%
Confused 1.3%
Sad 1.3%
Fear 0.5%
Disgusted 0.2%
Happy 0.2%

AWS Rekognition

Age 2-8
Gender Female, 63.1%
Confused 48.8%
Calm 31.7%
Sad 5.7%
Surprised 5.4%
Fear 4.9%
Happy 1.8%
Angry 1.2%
Disgusted 0.4%
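
The age/gender/emotion blocks above have the shape of Amazon Rekognition's DetectFaces output with all facial attributes requested. A minimal sketch, again with a placeholder region and file name:

    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open("bunkbeds.jpg", "rb") as f:
        response = rekognition.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],
        )

    # One FaceDetails entry per detected face, like the four blocks above.
    for face in response["FaceDetails"]:
        age, gender = face["AgeRange"], face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")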

Microsoft Cognitive Services

Age 16
Gender Female

Microsoft Cognitive Services

Age 20
Gender Female

Microsoft Cognitive Services

Age 22
Gender Female

Microsoft Cognitive Services

Age 7
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely
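
The ratings above follow Google Cloud Vision's face-detection likelihood scale (VERY_UNLIKELY through VERY_LIKELY). A sketch using the google-cloud-vision client, assuming application credentials are configured and using the same placeholder file name:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("bunkbeds.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    # Each annotation reports per-attribute likelihood enums.
    for face in response.face_annotations:
        print("Surprise", face.surprise_likelihood.name)
        print("Anger", face.anger_likelihood.name)
        print("Sorrow", face.sorrow_likelihood.name)
        print("Joy", face.joy_likelihood.name)
        print("Headwear", face.headwear_likelihood.name)
        print("Blurred", face.blurred_likelihood.name)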

Feature analysis

Amazon

Person 94.7%
Chair 93.8%
Bed 75.7%

Categories

Imagga

interior objects 100%

Text analysis

Amazon

Rinso

Google

iRinsel
iRinsel
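
Strings like "Rinso" are the sort of result Amazon Rekognition's DetectText returns for printed text in a photograph. A minimal sketch with the same placeholder setup as above:

    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open("bunkbeds.jpg", "rb") as f:
        response = rekognition.detect_text(Image={"Bytes": f.read()})

    # LINE detections give full detected strings; WORD entries break them up.
    for detection in response["TextDetections"]:
        if detection["Type"] == "LINE":
            print(detection["DetectedText"])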