Human Generated Data

Title

[People sitting at outside table]

Date

1933

People

Artist: Lyonel Feininger, American, 1871–1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.247.4

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2019-11-19

Human 94.1
Furniture 91.3
Chair 91.3
Person 85.9
Face 84.2
Person 80.6
Apparel 80.1
Clothing 80.1
Nature 76.6
Snow 76.6
Outdoors 76.6
Snowman 76.6
Winter 76.6
Sitting 65.5
Photo 61.1
Photography 61.1
Female 57.8
Flooring 56.5
Sleeve 55.3

Clarifai
created on 2019-11-19

people 99.8
adult 99.1
one 98.7
wear 97.4
two 95.8
woman 94.4
man 93.6
veil 91.7
group 89.5
portrait 87.7
group together 85.2
musician 84.5
music 84.2
street 84
outfit 81.2
three 79.9
retro 77.1
leader 77
child 71
administration 70.1

Imagga
created on 2019-11-19

wall 18
dress 17.2
old 16
man 15.5
person 15.1
portrait 14.9
tool 14.6
building 13.9
groom 13.8
people 13.4
mother 11.3
body 11.2
city 10.8
sexy 10.4
art 10.4
male 10.3
wheeled vehicle 10.3
snow 10.2
happiness 10.2
dirty 9.9
vintage 9.9
adult 9.8
human 9.7
lady 9.7
black 9.6
ancient 9.5
hair 9.5
parent 9.4
grunge 9.4
water 9.3
model 9.3
traditional 9.1
alone 9.1
attractive 9.1
child 8.8
couple 8.7
love 8.7
lifestyle 8.7
architecture 8.6
lawn mower 8.5
fashion 8.3
shovel 8.2
outdoors 8.2
girls 8.2
sensuality 8.2
home 8
urban 7.9
face 7.8
space 7.8
cold 7.7
travel 7.7
tree 7.7
bride 7.7
stone 7.6
room 7.5
decoration 7.4
street 7.4
detail 7.2
weather 7.1

Google
created on 2019-11-19

Microsoft
created on 2019-11-19

text 96.8
person 93.4
outdoor 89.2
black and white 88.9
clothing 87
wedding dress 85.4
woman 75.5
white 69.2
bride 68.5
human face 65.6
drawing 63.6
old 58
dress 56.2

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 26-42
Gender Female, 50.2%
Disgusted 45.1%
Calm 52.9%
Sad 46.4%
Confused 45.1%
Fear 45.1%
Surprised 45.1%
Happy 45.3%
Angry 45.2%

AWS Rekognition

Age 65-79
Gender Female, 52.2%
Surprised 2.6%
Angry 4.9%
Disgusted 1.8%
Calm 12.1%
Confused 3.3%
Fear 1.5%
Sad 26.3%
Happy 47.6%

Feature analysis

Amazon

Person 85.9%
Snowman 76.6%

Categories