Human Generated Data

Title

Untitled (woman holding package on head)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.16660

Machine Generated Data

Tags

Amazon
created on 2022-02-18

Person 97.5
Human 97.5
Person 97
Person 96.6
Person 94.9
Person 94.6
Shop 93.4
Person 87
Mannequin 60.8
Window Display 59.2
Shelf 57
Boutique 55.2

Imagga
created on 2022-02-18

case 77.5
interior 73.4
room 66.5
house 56.9
home 54.4
furniture 52
apartment 47
modern 46.3
decor 41.6
architecture 37.5
table 37
design 36
floor 33.5
luxury 33.5
bathroom 32
indoors 28.1
lamp 27.7
inside 27.6
window 26.9
sink 26.8
tile 25.8
decoration 25.3
living 24.6
wall 24.4
style 23.7
toilet 23.7
3d 23.2
kitchen 23.1
sofa 23.1
indoor 22.8
light 22.7
residential 22
comfortable 22
estate 21.9
domestic 21.7
wood 21.7
faucet 21.7
contemporary 21.6
bath 20.9
clean 20.9
counter 18.5
mirror 18.1
cabinet 17.3
nobody 17.1
chair 17.1
basin 17
washbasin 16.9
shower 16.7
vase 16.6
elegance 16
glass 15.8
carpet 14.6
wash 14.5
expensive 14.4
lifestyle 13.7
new 13
appliance 12.7
space 12.4
hotel 12.4
rendering 12.4
dining 12.4
bathtub 12.2
render 12.1
stylish 11.8
relaxation 11.7
designer 11.6
empty 11.2
minimalism 10.9
mansion 10.8
tiles 10.7
stainless 10.7
marble 10.6
cooking 10.5
fashionable 10.4
vessel 10.3
rest 10.2
residence 9.9
pillow 9.7
decorating 9.7
plant 9.7
ceramic 9.7
tub 9.3
relax 9.3
decorative 9.2
washstand 9.2
towel 9.1
furnishing 8.9
tap 8.9
ceiling 8.8
oven 8.8
chairs 8.8
living room 8.8
elegant 8.6
office 8.6
real 8.5
hygiene 8.5
seat 8.1
fixture 8.1
desk 8.1
stove 8.1
structure 8.1
bedroom 8
loft 7.9
door 7.8
luxurious 7.8
improvement 7.8
property 7.7
couch 7.7
lighting 7.7
bed 7.6
container 7.6
shelf 7.5
bowl 7.5
curtain 7.3
day 7.1
wooden 7

Google
created on 2022-02-18

Microsoft
created on 2022-02-18

text 97.1
indoor 95.8
table 91.4
furniture 84.5
vase 61.1
black and white 58.7
kitchen appliance 13.5

Face analysis

AWS Rekognition

Age 48-56
Gender Male, 83.9%
Happy 93.8%
Calm 1.6%
Fear 1.3%
Confused 1.1%
Sad 0.7%
Disgusted 0.6%
Angry 0.6%
Surprised 0.5%

AWS Rekognition

Age 40-48
Gender Male, 100%
Sad 85.3%
Happy 6.9%
Confused 2.6%
Disgusted 1.4%
Surprised 1.1%
Fear 1.1%
Calm 0.9%
Angry 0.7%

AWS Rekognition

Age 38-46
Gender Female, 95.6%
Calm 91.2%
Happy 6.3%
Sad 1.6%
Confused 0.3%
Disgusted 0.3%
Surprised 0.1%
Fear 0.1%
Angry 0.1%

AWS Rekognition

Age 22-30
Gender Female, 96.2%
Happy 32.7%
Sad 29%
Calm 22.9%
Confused 5.3%
Fear 4.1%
Angry 2.3%
Disgusted 2.2%
Surprised 1.5%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Feature analysis

Amazon

Person 97.5%

Captions

Microsoft

a person standing in front of a mirror 54.3%
a person standing in front of a mirror posing for the camera 50.8%
a person standing in front of a window 50.7%

Text analysis

Amazon

3
YT3RAS
MAGON