Human Generated Data

Title

Turks in Germany 1979

Date

1979

People

Artist: Candida Höfer, German, born 1944

Classification

Audiovisual Works

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Francis H. Burr Memorial Fund, 2019.126

Copyright

© Candida Höfer / Artists Rights Society (ARS), New York

Machine Generated Data

Tags

Amazon
created on 2022-08-13

Person 99.6
Human 99.6
Interior Design 99.6
Indoors 99.6
Person 99.3
Person 99.3
Person 99.2
Art 89.5
Metropolis 78.3
Building 78.3
Urban 78.3
City 78.3
Town 78.3
Screen 64.8
Electronics 64.8
Art Gallery 61.4
Photography 60.3
Photo 60.3
Home Decor 57.9
Monitor 57.2
Display 57.2
Collage 56
Poster 56
Advertisement 56
Flooring 55.1

Clarifai
created on 2023-10-31

indoors 98.7
family 97
people 95.8
room 95.3
exhibition 94.6
television 93.5
man 92
art 91.5
museum 90.7
painting 89.8
technology 89
mirror 85.4
wall 85.2
contemporary 85.2
light 83.7
woman 82.7
furniture 80.7
bedroom 80.2
interior design 80
window 79.9

Imagga
created on 2022-08-13

laptop 33.3
computer 33.3
television 31.6
binder 30.2
happy 30.1
smiling 27.5
protective covering 24.6
business 24.3
person 23.7
technology 21.5
people 21.2
office 21.1
working 20.3
adult 20.1
home 19.1
man 18.8
covering 18.6
smile 17.1
holding 15.7
male 14.9
child 14.7
sitting 14.6
indoors 14.1
telecommunication system 14
pretty 14
businesswoman 13.6
attractive 13.3
looking 12.8
work 12.6
happiness 12.5
executive 12.2
lady 12.2
hand 12.1
expression 11.9
desk 11.9
notebook 11.8
device 11.8
communication 11.8
portrait 11.6
wireless 11.4
cheerful 11.4
book 11
box 10.7
room 10.5
board 10.3
professional 10.3
student 10
family 9.8
worker 9.8
container 9.6
broadcasting 9.5
corporate 9.5
senior 9.4
manager 9.3
face 9.2
browsing 8.8
businessman 8.8
paper 8.7
women 8.7
sofa 8.7
display 8.6
mature 8.4
joy 8.4
friendly 8.2
one 8.2
screen 8.1
success 8
brunette 7.8
education 7.8
blank 7.7
money 7.7
career 7.6
fun 7.5
successful 7.3
confident 7.3
product 7.2
support 7.2
telecommunication 7.1
information 7.1
monitor 7.1

Google
created on 2022-08-13

Microsoft
created on 2022-08-13

wall 97.3
computer 95.5
person 93.1
television 93
art 90
human face 85.5
screenshot 81.6
gallery 72.6
laptop 71.7
screen 71.5
man 69.6
text 66.4
design 66.3
room 65.5
clothing 64.7

Color Analysis

Face analysis

AWS Rekognition

Age 23-31
Gender Female, 100%
Calm 100%
Surprised 6.3%
Fear 5.9%
Sad 2.2%
Confused 0%
Angry 0%
Happy 0%
Disgusted 0%

AWS Rekognition

Age 48-54
Gender Male, 99.9%
Calm 66.4%
Sad 47.5%
Surprised 6.8%
Fear 6.4%
Confused 1.9%
Disgusted 1.8%
Angry 1.2%
Happy 0.5%

AWS Rekognition

Age 45-51
Gender Male, 100%
Calm 98.7%
Surprised 6.4%
Fear 5.9%
Sad 2.2%
Confused 0.6%
Angry 0.1%
Happy 0.1%
Disgusted 0.1%

AWS Rekognition

Age 2-10
Gender Female, 97.9%
Calm 97.1%
Surprised 6.4%
Fear 6.2%
Sad 2.3%
Confused 0.7%
Happy 0.2%
Disgusted 0.1%
Angry 0.1%

Microsoft Cognitive Services

Age 42
Gender Male

Microsoft Cognitive Services

Age 42
Gender Male

Microsoft Cognitive Services

Age 28
Gender Female

Microsoft Cognitive Services

Age 7
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.6%

Categories

Imagga

interior objects 97.8%
food drinks 1.6%