Human Generated Data

Title

[Formal group photo, Heidelberg]

Date

unknown

People

Artist: Lyonel Feininger, American 1871 - 1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.401.3

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2021-04-05

Human 99.4
Person 99.4
Person 99.3
Person 99.1
Person 98.8
Person 96.5
Person 94
Person 93.8
Clothing 91.6
Apparel 91.6
Person 81
People 80.4
Person 79.5
Face 69.4
Person 66.8
Photo 62.9
Portrait 62.9
Photography 62.9
Clinic 60.7
Nurse 58.2
Crypt 57.5
Sailor Suit 56.4
Family 55.4
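
The Amazon list above pairs each detected label with a confidence score, and the low-confidence tail ("Clinic", "Crypt", "Sailor Suit") shows why a threshold matters. A minimal sketch of filtering such label/confidence pairs, using values transcribed from the list above (this is local post-processing, not the Rekognition API itself):

```python
# Label/confidence pairs transcribed from the Amazon tag list above
# (duplicate "Person" entries collapsed for brevity).
labels = [
    ("Human", 99.4), ("Person", 99.4), ("Clothing", 91.6),
    ("Apparel", 91.6), ("People", 80.4), ("Face", 69.4),
    ("Photo", 62.9), ("Portrait", 62.9), ("Photography", 62.9),
    ("Clinic", 60.7), ("Nurse", 58.2), ("Crypt", 57.5),
    ("Sailor Suit", 56.4), ("Family", 55.4),
]

def confident_labels(pairs, threshold=80.0):
    """Keep only labels at or above the confidence threshold."""
    return [name for name, score in pairs if score >= threshold]

print(confident_labels(labels))
```

With the default 80.0 cutoff, only the person/clothing labels survive and the speculative scene guesses are dropped.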

Clarifai
created on 2021-04-05

people 100
group 98.8
many 98.5
adult 98.1
group together 97.7
man 96.4
woman 96.3
child 94.5
dancing 94.1
leader 93.9
wear 92.6
dancer 91.2
music 90.9
administration 88.2
several 87.6
position 87.4
outfit 86.6
uniform 85.1
education 85
boy 82.5

Imagga
created on 2021-04-05

kin 31.1
nurse 21.5
old 20.9
person 20.4
people 19.5
religion 17
church 16.6
dress 16.2
groom 16.1
world 14.3
man 14.1
building 13.8
architecture 13.3
wall 12.8
love 11.8
art 11.8
history 11.6
statue 11.6
bride 11.5
fashion 11.3
antique 11.2
ancient 11.2
monument 11.2
tourism 10.7
altar 10.5
couple 10.4
portrait 10.3
room 10.3
happy 10
city 10
tourist 10
male 9.9
vintage 9.9
travel 9.9
catholic 9.8
cathedral 9.7
life 9.7
arch 9.7
adult 9.7
religious 9.4
wedding 9.2
scene 8.7
happiness 8.6
god 8.6
decoration 8.6
historical 8.5
clothing 8.5
stone 8.4
traditional 8.3
detail 8
student 7.9
holiday 7.9
holy 7.7
sculpture 7.6
marriage 7.6
historic 7.3
group 7.2
structure 7.2
black 7.2
blond 7.2
indoors 7

Google
created on 2021-04-05

Microsoft
created on 2021-04-05

person 96.2
wall 95.9
clothing 93.7
white 69.3
old 68.6
man 56.6


Face analysis

AWS Rekognition

Age 48-66
Gender Male, 90.2%
Calm 95.7%
Sad 1.5%
Surprised 1%
Confused 0.6%
Happy 0.5%
Angry 0.5%
Disgusted 0.2%
Fear 0.1%

AWS Rekognition

Age 22-34
Gender Male, 90%
Sad 90.1%
Calm 4.6%
Fear 1.8%
Angry 1.2%
Confused 1.1%
Happy 0.9%
Disgusted 0.2%
Surprised 0.1%

AWS Rekognition

Age 26-40
Gender Female, 56.9%
Calm 80.8%
Surprised 9.9%
Happy 4.3%
Angry 1.4%
Sad 1.2%
Confused 0.9%
Disgusted 0.9%
Fear 0.7%

AWS Rekognition

Age 40-58
Gender Female, 62.9%
Sad 57.6%
Calm 31.3%
Happy 6.2%
Confused 1.7%
Angry 1.6%
Fear 0.8%
Surprised 0.7%
Disgusted 0.2%

AWS Rekognition

Age 31-47
Gender Female, 98.8%
Sad 52.1%
Calm 23.5%
Angry 7.5%
Happy 7%
Fear 4.6%
Confused 3.1%
Surprised 1.5%
Disgusted 0.7%

AWS Rekognition

Age 40-58
Gender Male, 75.8%
Calm 56.5%
Happy 28.1%
Surprised 6.8%
Angry 3.1%
Fear 2.3%
Sad 2.2%
Confused 0.8%
Disgusted 0.4%

AWS Rekognition

Age 53-71
Gender Male, 71.5%
Calm 77.4%
Happy 9.8%
Fear 5.8%
Sad 4.6%
Surprised 0.9%
Angry 0.6%
Disgusted 0.5%
Confused 0.4%

AWS Rekognition

Age 42-60
Gender Female, 80%
Calm 49.5%
Happy 39.2%
Sad 6.4%
Angry 2.4%
Surprised 0.9%
Fear 0.6%
Disgusted 0.5%
Confused 0.5%

AWS Rekognition

Age 39-57
Gender Female, 77.3%
Calm 85.8%
Sad 4.6%
Happy 4.4%
Surprised 2.8%
Angry 1%
Fear 0.6%
Disgusted 0.5%
Confused 0.3%

AWS Rekognition

Age 38-56
Gender Female, 53.8%
Sad 57.2%
Calm 25%
Happy 6.6%
Surprised 4.8%
Angry 2.5%
Confused 1.9%
Fear 1.7%
Disgusted 0.3%
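
Each AWS Rekognition block above reports an age range, a gender guess, and an emotion distribution summing to roughly 100%. A small sketch of reducing such a distribution to its dominant emotion, using values transcribed from the first two blocks above (local post-processing of the reported numbers, not an API call):

```python
# Emotion distributions transcribed from the first two AWS Rekognition
# face blocks above.
faces = [
    {"Calm": 95.7, "Sad": 1.5, "Surprised": 1.0, "Confused": 0.6,
     "Happy": 0.5, "Angry": 0.5, "Disgusted": 0.2, "Fear": 0.1},
    {"Sad": 90.1, "Calm": 4.6, "Fear": 1.8, "Angry": 1.2,
     "Confused": 1.1, "Happy": 0.9, "Disgusted": 0.2, "Surprised": 0.1},
]

def dominant_emotion(distribution):
    """Return the emotion with the highest reported confidence."""
    return max(distribution, key=distribution.get)

print([dominant_emotion(face) for face in faces])
```

Note that several faces further down split their confidence (e.g. Sad 52.1% vs. Calm 23.5%), so the dominant label alone can hide considerable uncertainty.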

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
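
Unlike the numeric scores above, Google Vision reports ordinal likelihood buckets. A sketch of treating those buckets as an ordered scale, using the "Blurred" verdicts transcribed from the fourteen Google Vision blocks above (the bucket ordering is the standard Vision likelihood scale; the counting helper is hypothetical):

```python
# Google Vision's likelihood buckets, ordered from least to most likely.
LIKELIHOOD = ["Very unlikely", "Unlikely", "Possible", "Likely", "Very likely"]

# "Blurred" verdicts transcribed, in order, from the Google Vision
# face blocks above.
blurred = [
    "Very unlikely", "Possible", "Very unlikely", "Possible",
    "Very unlikely", "Possible", "Very unlikely", "Very unlikely",
    "Possible", "Very unlikely", "Very unlikely", "Very likely",
    "Very unlikely", "Very unlikely",
]

def at_least(verdict, floor="Possible"):
    """True when the verdict is at or above the given likelihood floor."""
    return LIKELIHOOD.index(verdict) >= LIKELIHOOD.index(floor)

print(sum(at_least(verdict) for verdict in blurred))
```

By this count, five of the fourteen detected faces are at least possibly blurred, including one rated "Very likely".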

Feature analysis

Amazon

Person 99.4%