Human Generated Data

Title

[Julia Feininger and Walter and Ise Gropius]

Date

1940s-1950s

People

Artist: Lyonel Feininger, American, 1871–1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.1011.134

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2023-10-25

Photography 100
Face 100
Head 100
Portrait 100
People 99.9
Person 99.5
Boy 99.5
Male 99.5
Teen 99.5
Clothing 99
Coat 99
Person 98.8
Male 98.8
Adult 98.8
Man 98.8
Person 97.1
Formal Wear 97
Shirt 96.7
Hat 92.5
Electrical Device 88
Microphone 88
Plant 85.1
Tree 85.1
Accessories 84.6
Tie 84.6
Suit 81.8
Outdoors 77.9
Grass 65.3
Photographer 57.6
Blazer 56.8
Jacket 56.8
Electronics 56.7
Cap 56.6
Nature 56.5
Vegetation 56.2
Baseball Cap 55.6
Camera 55.6
Video Camera 55.6
Overcoat 55.4
Vest 55.1
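
The label/score pairs above are the kind of output returned by Amazon Rekognition's DetectLabels operation, where each score is a confidence percentage. As an illustration only (not the museum's actual pipeline), a minimal Python sketch using boto3, with an assumed local file name and thresholds:

```python
# Illustrative sketch only: reproduces the shape of the Amazon tag list above
# using Amazon Rekognition's DetectLabels API via boto3. The file name and
# thresholds are assumptions, not part of the museum record.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("feininger_gropius.jpg", "rb") as f:  # hypothetical local copy of the photograph
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=50,        # the list above contains roughly 40 labels
    MinConfidence=55.0,  # the lowest score shown above is about 55
)

for label in response["Labels"]:
    # Prints lines like "Photography 100.0", matching the format of the tag list.
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```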

Clarifai
created on 2023-10-15

people 99.8
monochrome 99.7
portrait 99.3
two 98.2
man 97.7
adult 96.2
documentary 93
black and white 91.3
sepia 89.7
child 89.4
woman 89
affection 88.9
war 88.5
group 88
administration 87.6
wear 86.9
three 86.8
wedding 86.1
interaction 85.4
facial expression 83.9

Imagga
created on 2019-01-31

man 31.6
male 26.4
person 24.6
people 22.3
black 19.2
couple 18.3
adult 16.9
love 15.8
portrait 15.5
men 15.4
bow tie 12.6
silhouette 12.4
musical instrument 11.9
happy 11.9
kin 11.6
human 10.5
attractive 10.5
hair 10.3
smiling 10.1
clothing 10.1
holding 9.9
day 9.4
happiness 9.4
necktie 9.2
hand 9.1
business 9.1
romance 8.9
style 8.9
wind instrument 8.7
child 8.7
groom 8.6
sax 8.6
marriage 8.5
face 8.5
youth 8.5
casual 8.5
power 8.4
dark 8.3
sexy 8
looking 8
lifestyle 7.9
women 7.9
brunette 7.8
smile 7.8
pretty 7.7
expression 7.7
world 7.6
two 7.6
brass 7.3
music 7.3
success 7.2
sunset 7.2

Google
created on 2019-01-31

Microsoft
created on 2019-01-31

man 92.4
person 91.9
window 86.8
old 72.3
posing 42.1
image 39
black and white 20.3
music 15.2

Color Analysis

Face analysis

AWS Rekognition

Age 47-53
Gender Female, 96.6%
Happy 32.3%
Sad 24.6%
Calm 19.1%
Angry 10.2%
Surprised 9.7%
Fear 7.7%
Confused 7.1%
Disgusted 2.7%

AWS Rekognition

Age 36-44
Gender Female, 100%
Calm 99.4%
Surprised 6.3%
Fear 5.9%
Sad 2.2%
Happy 0.1%
Confused 0.1%
Angry 0.1%
Disgusted 0.1%

AWS Rekognition

Age 49-57
Gender Female, 59%
Angry 36.9%
Calm 28%
Surprised 13.4%
Fear 12.6%
Happy 5.4%
Disgusted 3.2%
Sad 3%
Confused 1.9%
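
The three age/gender/emotion blocks above correspond to the per-face records returned by Amazon Rekognition's DetectFaces operation with all facial attributes enabled. A minimal sketch of such a call, again assuming a hypothetical local copy of the image:

```python
# Illustrative sketch only: how per-face age ranges, gender, and emotion scores
# like those above are returned by Amazon Rekognition's DetectFaces API.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("feininger_gropius.jpg", "rb") as f:  # hypothetical local copy
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # include AgeRange, Gender, and Emotions, not just bounding boxes
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotions come back as {"Type": "HAPPY", "Confidence": ...}; sort high to low
    # to match the ordering of the blocks above.
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```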

Microsoft Cognitive Services

Age 39
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely
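
The Google Vision rows report coarse likelihood buckets (Very unlikely through Very likely) from the Cloud Vision face detection feature rather than numeric scores. A minimal sketch, again with an assumed local file name:

```python
# Illustrative sketch only: Google Cloud Vision face detection returns
# per-face likelihood buckets (VERY_UNLIKELY ... VERY_LIKELY), which map to
# the readable labels listed above.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("feininger_gropius.jpg", "rb") as f:  # hypothetical local copy
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```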

Feature analysis

Amazon

Person 99.5%
Boy 99.5%
Male 99.5%
Teen 99.5%
Adult 98.8%
Man 98.8%
Suit 81.8%

Categories

Imagga

interior objects 61.5%
paintings art 34.5%
food drinks 3.6%