Human Generated Data

Title

[People sitting at outside table]

Date

1933

People

Artist: Lyonel Feininger, American 1871 - 1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.247.5

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2019-11-19

Nature 99.7
Outdoors 99.2
Snow 94.8
Person 89.1
Human 89.1
Clothing 84
Apparel 84
Winter 83.6
Brick 74.6
Wall 69.1
Female 67
Snowman 66.2
Pants 61.4
Banister 56.9
Handrail 56.9
Shorts 55.4

Clarifai
created on 2019-11-19

people 99.7
adult 96.8
one 96.6
wear 94.3
man 92.9
two 92.9
group together 91.5
street 90.8
monochrome 88.9
woman 88.4
music 84.9
group 84.4
portrait 82.6
art 80.9
musician 78
three 77.9
leader 77.8
retro 75.7
administration 74.6
child 74.2

Imagga
created on 2019-11-19

violin 30.2
bowed stringed instrument 27.5
stringed instrument 21.9
old 20.9
grunge 20.4
dirty 18.1
wall 17.2
musical instrument 17.1
building 16
black 15
person 14.9
snow 13.6
city 13.3
vintage 13.2
street 12.9
negative 12.5
portrait 12.3
urban 12.2
man 11.4
detail 11.3
people 11.2
barbershop 11
dress 10.8
weathered 10.5
texture 10.4
art 10.4
cold 10.3
paint 10
aged 10
adult 9.8
stone 9.4
architecture 9.4
winter 9.4
lady 8.9
style 8.9
ancient 8.6
decoration 8.6
film 8.5
house 8.4
dark 8.4
shop 8.3
fashion 8.3
pattern 8.2
rough 8.2
device 8
male 7.9
scene 7.8
antique 7.8
culture 7.7
damaged 7.6
grungy 7.6
wood 7.5
human 7.5
outdoors 7.5
design 7.3
color 7.2
surface 7.1
travel 7

Google
created on 2019-11-19

Microsoft
created on 2019-11-19

text 97.4
black and white 87.6
person 82.9
clothing 80.6
drawing 79.3
white 61.3
human face 57.8
woman 57.2
wedding dress 52.7

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 34-50
Gender Female, 50.4%
Happy 45.3%
Fear 47.6%
Angry 45.7%
Calm 46.6%
Sad 49.2%
Disgusted 45.2%
Confused 45.2%
Surprised 45.2%

AWS Rekognition

Age 28-44
Gender Female, 53.2%
Surprised 45.2%
Angry 45.2%
Happy 46.1%
Fear 45.3%
Calm 46.3%
Confused 45.7%
Sad 51%
Disgusted 45.2%

Feature analysis

Amazon

Person 89.1%

Captions