Human Generated Data

Title

[Tomas and Andreas Feininger with Two Women and a Car]

Date

1950s

People

Artist: Lyonel Feininger, American, 1871-1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.437.137

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2021-12-13

Person 99.8
Human 99.8
Person 99.6
Person 99.6
Person 98.9
Vehicle 86.3
Car 86.3
Automobile 86.3
Transportation 86.3
Standing 75
People 74.5
Outdoors 69.4
Face 63.6
Nature 60.6
Duel 56

Imagga
created on 2021-12-13

kin 38.6
man 36.9
beach 30.5
sunset 29.6
people 29
sea 28.9
water 26
couple 25.2
summer 25.1
child 24.8
male 24.7
love 23.7
world 23.6
outdoors 23.6
silhouette 22.3
sky 22.3
ocean 20.9
sun 20.7
person 20.5
together 20.1
vacation 19.6
travel 18.3
lake 17.4
family 16.9
father 16.4
happy 15.7
fun 15
boy 14.8
walk 14.3
park 14
sand 13.8
lifestyle 13.7
parent 13.6
holiday 13.6
landscape 13.4
day 13.3
happiness 13.3
tropical 12.8
two 12.7
relax 12.6
coast 12.6
dad 12.5
romantic 12.5
tourism 12.4
spectator 12.1
men 12
mother 11.8
autumn 11.4
life 11.4
fall 10.9
smiling 10.8
outdoor 10.7
groom 10.5
shore 10.4
walking 10.4
light 10.1
relaxation 10
leisure 10
portrait 9.7
group 9.7
standing 9.6
dusk 9.5
sunny 9.5
season 9.3
waves 9.3
joy 9.2
adult 9.1
holding 9.1
human 9
active 9
river 8.9
seaside 8.7
togetherness 8.5
vacations 8.5
coastline 8.5
relationship 8.4
evening 8.4
exercise 8.2
horizon 8.1
son 8.1
romance 8
brother 7.9
pier 7.8
serenity 7.8
husband 7.7
fishing 7.7
wife 7.6
sibling 7.6
kids 7.5
wood 7.5
sunrise 7.5
sunshine 7.5
holidays 7.5
reflection 7.4
calm 7.3
peace 7.3
children 7.3
tree 7.2
smile 7.1
trees 7.1
grass 7.1
daughter 7.1
little 7.1
shoreline 7

Google
created on 2021-12-13

Microsoft
created on 2021-12-13

clothing 96.5
text 96.4
person 95
tree 91.6
outdoor 89.4
standing 89
man 78.7
fog 64.4
woman 58.6
picture frame 9.7

Face analysis

AWS Rekognition

Age 23-35
Gender Male, 63.6%
Calm 91.2%
Sad 4.5%
Happy 2.2%
Confused 1.7%
Angry 0.2%
Surprised 0.1%
Disgusted 0%
Fear 0%

AWS Rekognition

Age 26-40
Gender Male, 98.5%
Happy 87.6%
Calm 10.2%
Sad 1.2%
Confused 0.5%
Surprised 0.3%
Angry 0.1%
Fear 0%
Disgusted 0%

AWS Rekognition

Age 23-35
Gender Male, 79.7%
Fear 36.3%
Sad 20%
Confused 15.7%
Calm 11.7%
Angry 7.5%
Happy 6.6%
Surprised 1.2%
Disgusted 0.8%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Feature analysis

Amazon

Person 99.8%
Car 86.3%

Captions

Microsoft

a group of people standing in front of a window 90%
a group of people standing in front of a window posing for the camera 85.2%
a group of people posing for a photo in front of a window 85.1%