Human Generated Data

Title

Untitled (montage with portraits)

Date

1984

People

Artist: Vaughn Sills, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.3197

Copyright

© Vaughn Sills

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Person 99.6
Human 99.6
Person 99.4
Text 94.4
Indoors 83.3
Art 74.4
Drawing 70.4
Room 70.1
Canvas 69.5
Person 66.2
Furniture 65.5
Desk 65.5
Table 65.5
Sitting 64.5
Sketch 62.6
Handwriting 56.1

Clarifai
created on 2023-10-25

people 99.8
portrait 99.3
adult 99.3
monochrome 99.3
man 98.8
art 97.3
two 97
one 95.9
painting 92.3
woman 91.8
music 89
wear 89
street 87.5
group 87.1
facial expression 86.9
painter 85.5
concentration 83.9
boy 83
chair 80.1
easel 79.9

Imagga
created on 2022-01-09

laptop 48.6
computer 45.9
office 44.4
working 37.1
business 36.5
notebook 33.4
person 31.4
work 31.4
adult 29.2
people 29
happy 27.6
businesswoman 27.3
professional 26.5
corporate 25.8
attractive 25.2
job 23
desk 23
painter 22.6
executive 22.4
smiling 21.7
smile 21.4
pretty 21
technology 20.8
sitting 20.6
confident 20
suit 19.8
worker 17.8
looking 17.6
one 17.2
portrait 16.8
businessman 16.8
man 16.1
home 16
manager 15.8
table 15.6
face 15.6
student 15.5
success 15.3
black 15.1
women 15
indoors 14.9
wireless 14.3
businesspeople 14.2
career 14.2
male 14.2
keyboard 14.1
brunette 13.9
alone 13.7
communication 13.4
education 13
lady 13
sexy 12.9
20s 12.8
casual 12.7
portable computer 12.7
cheerful 12.2
successful 11.9
using 11.6
modern 11.2
room 11.2
lifestyle 10.8
employee 10.5
formal 10.5
workplace 10.5
support 10.3
happiness 10.2
model 10.1
indoor 10
fashion 9.8
personal computer 9.6
boss 9.6
hair 9.5
pen 9.4
youth 9.4
study 9.3
phone 9.2
book 9
call 8.9
interior 8.8
stringed instrument 8.8
staff 8.7
paper 8.6
cute 8.6
clothing 8.5
musical instrument 8.5
shirt 8.4
studio 8.4
hand 8.4
service 8.3
fun 8.2
friendly 8.2
secretary 8.1
group 8.1
handsome 8
standing 7.8
typing 7.8
men 7.7
studying 7.7
two 7.6
tie 7.6
jacket 7.5
one person 7.5
glasses 7.4
occupation 7.3

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

text 97.4
person 94.8
indoor 86.2
drawing 85.7
clothing 82.8
man 77.2
sketch 74.3
black and white 73.6
handwriting 71.1
human face 66.8

Face analysis

AWS Rekognition

Age 21-29
Gender Female, 94%
Calm 99.4%
Confused 0.1%
Happy 0.1%
Angry 0.1%
Sad 0.1%
Surprised 0.1%
Disgusted 0%
Fear 0%

AWS Rekognition

Age 22-30
Gender Female, 66.9%
Calm 97.9%
Surprised 0.6%
Sad 0.6%
Angry 0.2%
Disgusted 0.2%
Confused 0.2%
Fear 0.2%
Happy 0.1%

Microsoft Cognitive Services

Age 22
Gender Female

Microsoft Cognitive Services

Age 22
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.6%

Text analysis

Amazon

ats
fat
Nana
fat much Nana
much
Fold

Google

fort
Nind
fort Nind