Human Generated Data

Title

Düsseldorf Family IV 1975

Date

1975

People

Artist: Candida Höfer, German, born 1944

Classification

Photographs

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Purchase through the generosity of the German Friends of the Busch-Reisinger Museum, 2019.129

Copyright

© Candida Höfer / Artists Rights Society (ARS), New York

Machine Generated Data

Tags

Amazon
created on 2023-01-11

Screen 100
Electronics 100
Computer Hardware 100
Hardware 100
TV 100
Portrait 99.8
Photography 99.8
Face 99.8
Head 99.8
Person 99.4
Baby 98.4
Person 98.4
Monitor 98.3
Person 97.1
Child 97.1
Male 97.1
Boy 97.1
Table 82.3
Furniture 82.3
Person 78.5
Dining Table 74.6
Indoors 72.5
Rug 59.3
Home Decor 59.3
Building 57.9
Architecture 57.9
Dining Room 57.9
Room 57.9
Living Room 57.4
Plant 56.3
Potted Plant 56.3
Cutlery 55.8
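Each machine-generated tag above is paired with a confidence score (0-100). A minimal sketch, assuming the tags have been flattened to (name, score) pairs, of filtering such output by a confidence threshold; the sample values are copied from the Amazon list above:

```python
# Labels as returned by an image-tagging service such as AWS Rekognition's
# DetectLabels, flattened to (name, confidence) pairs. Values copied from
# the Amazon tag list above (a subset, for illustration).
labels = [
    ("Screen", 100.0), ("Portrait", 99.8), ("Person", 99.4),
    ("Baby", 98.4), ("Monitor", 98.3), ("Rug", 59.3), ("Cutlery", 55.8),
]

def confident_labels(labels, threshold=90.0):
    """Keep only labels at or above the given confidence threshold."""
    return [name for name, score in labels if score >= threshold]

print(confident_labels(labels))
# -> ['Screen', 'Portrait', 'Person', 'Baby', 'Monitor']
```

Lower-confidence tags (Rug, Cutlery) drop out; the threshold is a hypothetical choice, not part of the service output.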

Clarifai
created on 2023-10-13

people 99.9
child 98.8
monochrome 98.8
two 97.5
room 97.4
portrait 96.8
documentary 96.7
one 95
family 94.1
furniture 93.9
analogue 93.7
vintage 93.3
street 92.7
adult 92.4
music 92
group 91.2
nostalgia 91
retro 90.5
woman 89.9
girl 89.6

Imagga
created on 2023-01-11

shop 26
television 23.3
person 21.5
adult 18.8
people 18.4
happy 18.2
telecommunication system 17.8
man 16.1
mother 15.6
smiling 15.2
home 14.4
interior 14.1
restaurant 14.1
family 13.3
mercantile establishment 12.5
store 12.3
child 12.2
building 12.1
old 11.8
portrait 11.6
business 11.5
retail 11.4
table 11.4
window 10.9
smile 10.7
stall 10.7
indoors 10.5
holiday 10
attractive 9.8
male 9.6
women 9.5
counter 9.2
city 9.1
vintage 9.1
fashion 9
parent 8.9
room 8.7
ancient 8.6
work 8.6
sitting 8.6
house 8.4
place of business 8.3
shopping 8.3
indoor 8.2
girls 8.2
style 8.2
food 8
looking 8
decoration 8
job 8
lifestyle 7.9
hair 7.9
urban 7.9
couple 7.8
black 7.8
gift 7.7
hand 7.7
customer 7.6
casual 7.6
clothing 7.5
one 7.5
coffee 7.4
toyshop 7.4
retro 7.4
occupation 7.3
worker 7.3
kitchen 7.2
working 7.1
architecture 7

Google
created on 2023-01-11

Microsoft
created on 2023-01-11

wall 95.8
text 93
clothing 92.9
person 92.7
human face 92.3
indoor 92.1
baby 84.7
black and white 84.1
toddler 80.9

Color Analysis

Face analysis


AWS Rekognition

Age 1-7
Gender Female, 100%
Sad 96.9%
Calm 44.5%
Fear 6.9%
Surprised 6.4%
Angry 0.7%
Disgusted 0.6%
Confused 0.2%
Happy 0.2%

AWS Rekognition

Age 2-8
Gender Male, 92.3%
Fear 95.7%
Surprised 13.1%
Sad 2.3%
Calm 1.1%
Confused 0.5%
Angry 0.5%
Happy 0.3%
Disgusted 0.2%

AWS Rekognition

Age 6-12
Gender Female, 100%
Fear 96.8%
Surprised 8.9%
Sad 2.8%
Calm 0.7%
Confused 0.3%
Happy 0.2%
Disgusted 0.2%
Angry 0.1%
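The per-face emotion scores above are independent confidences (they need not sum to 100%). A minimal sketch, assuming the scores for one face are held in a dict, of picking the dominant emotion; the values are copied from the first AWS Rekognition face above:

```python
# Emotion confidences (percentages) for one detected face, as reported by
# AWS Rekognition face analysis. Values copied from the first face above.
# Scores are independent confidences, not a probability distribution.
emotions = {
    "Sad": 96.9, "Calm": 44.5, "Fear": 6.9, "Surprised": 6.4,
    "Angry": 0.7, "Disgusted": 0.6, "Confused": 0.2, "Happy": 0.2,
}

def dominant_emotion(emotions):
    """Return the emotion name with the highest reported confidence."""
    return max(emotions, key=emotions.get)

print(dominant_emotion(emotions))  # -> Sad
```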

Microsoft Cognitive Services

Age 6
Gender Male

Microsoft Cognitive Services

Age 5
Gender Male

Microsoft Cognitive Services

Age 31
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.4%
Baby 98.4%
Monitor 98.3%
Child 97.1%
Male 97.1%
Boy 97.1%
Rug 59.3%

Categories

Imagga

paintings art 58.2%
people portraits 41.3%

Text analysis

Amazon

8025
the
the and
and

Google

8025 V
8025
V