Human Generated Data

Title

[Andreas and Tomas Feininger]

Date

1930s

People

Artist: Lyonel Feininger, American 1871 - 1956

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of T. Lux Feininger, BRLF.437.78

Copyright

© Artists Rights Society (ARS), New York / VG Bild-Kunst, Bonn

Machine Generated Data

Tags

Amazon
created on 2021-12-13

Person 99.3
Human 99.3
Face 78.8
Advertisement 76.9
Art 75.2
Poster 74
Furniture 69.3
Text 66
Clothing 62.3
Apparel 62.3
Photo 62.2
Photography 62.2
Portrait 60.2
Sitting 57

Clarifai
created on 2023-10-15

people 99.9
adult 99.3
two 99.2
one 98.2
man 97.8
room 95.7
child 93.9
three 93.6
furniture 93.3
leader 92.5
administration 92.2
sit 92
portrait 90.5
group 90.1
family 89.2
four 88.5
home 86.9
medical practitioner 86.6
chair 84.9
interaction 84.8

Imagga
created on 2021-12-13

person 29.8
man 26.9
male 24.2
people 23.4
old 19.5
happy 16.9
room 15.5
elderly 15.3
adult 15.1
senior 15
negative 14.2
office 13.8
aged 13.6
blackboard 13.4
portrait 12.9
business 12.8
working 12.4
looking 12
casual 11.9
film 11.8
businessman 11.5
standing 11.3
home 11.2
men 11.2
wall 11.1
grunge 11.1
smiling 10.8
lifestyle 10.8
holding 10.7
smile 10.7
one 10.4
nurse 10.2
work 10.2
house 10
hand 9.9
patient 9.8
indoors 9.7
retirement 9.6
hair 9.5
happiness 9.4
world 9
worker 9
computer 8.8
retired 8.7
antique 8.7
sitting 8.6
child 8.5
school 8.5
face 8.5
professional 8.5
alone 8.2
teacher 8.1
job 8
education 7.8
color 7.8
ancient 7.8
call 7.8
corporate 7.7
shirt 7.7
health 7.6
reading 7.6
outdoors 7.5
mature 7.4
camera 7.4
life 7.4
retro 7.4
classroom 7.3
occupation 7.3
cheerful 7.3
lady 7.3
building 7.3
sick person 7.2
gray 7.2
family 7.1
case 7.1

Google
created on 2021-12-13

Microsoft
created on 2021-12-13

text 93.1
man 92.1
clothing 90.9
human face 90.2
person 88.2
white 72.8
old 72.5
black 72.4
drawing 62.7
black and white 59.9

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 20-32
Gender Male, 87.3%
Calm 77.9%
Happy 8.7%
Sad 4.9%
Disgusted 2.6%
Fear 2.1%
Angry 1.5%
Confused 1.5%
Surprised 0.7%

AWS Rekognition

Age 26-40
Gender Female, 78.3%
Calm 56%
Happy 36.9%
Sad 5.6%
Surprised 0.4%
Angry 0.4%
Confused 0.3%
Fear 0.2%
Disgusted 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.3%

Captions

Microsoft
created on 2021-12-13

an old photo of a man 89.5%
a man standing in front of a window 78%
old photo of a man 77.9%