Human Generated Data

Title

Untitled (window display of Fleisher Yarn products)

Date

1939

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.3955

Machine Generated Data

Tags

Amazon
created on 2019-06-01

Figurine 96.3
Human 94.5
Person 94.5
Person 86.1
Person 83.2
Person 82.7
Shop 82.3
Person 76.2
Person 75.8
Mannequin 74.5
Window Display 74.1
Museum 74
Person 74
Toy 64.7
Person 62.1
Person 49.9
Person 45.6
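
The Amazon tags above are label-detection results (label name plus confidence score) generated on 2019-06-01; the repeated Person entries most likely correspond to individual person instances reported under a single label. A minimal sketch of how tags in this form can be produced, assuming AWS Rekognition's detect_labels call via boto3 and a hypothetical local copy of the photograph:

# Minimal sketch: object/scene labels like the Amazon tags above.
# Assumes AWS credentials are configured; the file name is hypothetical.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("fleisher_yarn_display.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=20,         # assumption: the record lists about this many tags
    MinConfidence=40,     # assumption: the lowest score listed above is 45.6
)

for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')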

Clarifai
created on 2019-06-01

people 99.8
group 98.1
furniture 97.7
adult 96.4
many 95.1
administration 95
leader 94.5
group together 93.9
man 93.8
woman 93.1
room 92.8
chair 91.3
several 90.3
sit 88.5
home 87.1
indoors 83
desk 82.6
wear 79.2
monochrome 79
two 78.9

Imagga
created on 2019-06-01

altar 34.1
salon 27.6
table 27.3
room 26.3
interior 23.9
structure 21.7
people 21.2
home 20.7
case 20.3
indoors 20.2
person 18.2
women 17.4
shop 16.3
man 16.1
male 15.6
sitting 15.5
house 15
chair 15
restaurant 14.8
decoration 14.5
lifestyle 14.4
celebration 14.3
adult 13.9
happy 13.8
men 13.7
indoor 13.7
party 12.9
two 12.7
happiness 12.5
smiling 12.3
holiday 12.2
inside 12
furniture 11.3
couple 11.3
luxury 11.1
love 11
glass 10.9
smile 10.7
family 10.7
modern 10.5
pretty 10.5
gift 10.3
coffee 10.2
wedding 10.1
leisure 10
window 9.9
professional 9.9
decor 9.7
new 9.7
hotel 9.5
color 9.4
elegant 9.4
dinner 9.4
mercantile establishment 9.2
drink 9.2
portrait 9.1
classroom 9
desk 8.8
teacher 8.8
together 8.8
light 8.7
life 8.7
work 8.6
design 8.4
elegance 8.4
old 8.4
service 8.3
holding 8.2
style 8.2
dress 8.1
cheerful 8.1
kitchen 8
lunch 7.7
bride 7.7
apartment 7.7
drinking 7.6
comfortable 7.6
counter 7.6
relaxation 7.5
tea 7.5
floor 7.4
barbershop 7.4
food 7.4
baron 7.2
groom 7.1
office 7.1

Google
created on 2019-06-01

Microsoft
created on 2019-06-01

wall 98.2
indoor 96.5
person 82.7
vase 81.2
flower 75
black and white 73.8
clothing 61.6
posing 42.2

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 17-27
Gender Female, 50.5%
Angry 49.6%
Happy 49.6%
Confused 49.5%
Calm 49.5%
Disgusted 49.8%
Sad 49.9%
Surprised 49.6%

AWS Rekognition

Age 20-38
Gender Female, 50.4%
Disgusted 49.5%
Calm 50.4%
Angry 49.5%
Sad 49.5%
Happy 49.5%
Surprised 49.5%
Confused 49.5%

AWS Rekognition

Age 26-43
Gender Female, 50.4%
Surprised 49.6%
Sad 49.8%
Happy 49.7%
Angry 49.6%
Disgusted 49.5%
Confused 49.6%
Calm 49.8%

AWS Rekognition

Age 35-52
Gender Female, 50.4%
Angry 49.6%
Happy 49.6%
Calm 49.8%
Surprised 49.6%
Sad 49.9%
Confused 49.5%
Disgusted 49.6%

AWS Rekognition

Age 35-52
Gender Female, 50.4%
Confused 49.6%
Surprised 49.5%
Sad 49.9%
Angry 49.5%
Happy 49.5%
Calm 49.9%
Disgusted 49.5%

AWS Rekognition

Age 17-27
Gender Female, 50.5%
Angry 49.6%
Sad 49.6%
Disgusted 49.6%
Happy 50.1%
Calm 49.5%
Surprised 49.6%
Confused 49.5%
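
Each block above is one face detected by AWS Rekognition, reported as an estimated age range, a gender guess with its confidence, and a confidence score per emotion. A minimal sketch of a call that returns these attributes, again assuming boto3 and a hypothetical local copy of the photograph:

# Minimal sketch: face attributes (age range, gender, emotions) as listed above.
# The file name is hypothetical.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("fleisher_yarn_display.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],   # request age range, gender, and emotion scores
)

for face in response["FaceDetails"]:
    age, gender = face["AgeRange"], face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')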

Feature analysis

Amazon

Person 94.5%

Categories

Text analysis

Amazon

FLEISHER
FLEISHER YARNS
YARNS
Piccadilly
Sritish Piccadilly
Sritish
ThmrE

Google

British as Piccadilly FLEISHER FLEISHER YARNS
British
as
Piccadilly
FLEISHER
YARNS
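
The Amazon and Google lists above are text-detection (OCR) output; strings such as "Sritish Piccadilly" appear to be the engine's misreading of the "British Piccadilly" signage that Google transcribes correctly, and are reproduced as returned. A minimal sketch of the Amazon side, assuming AWS Rekognition's detect_text call via boto3 and a hypothetical local copy of the photograph:

# Minimal sketch: detected text (lines and individual words) as listed above.
# The file name is hypothetical.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("fleisher_yarn_display.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

for detection in response["TextDetections"]:
    # LINE entries group words, WORD entries are single tokens, which is
    # why both "FLEISHER YARNS" and "FLEISHER" appear in the list above.
    print(detection["Type"], detection["DetectedText"])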