Human Generated Data

Title

Untitled (three men playing cards)

Date

1944

People

Artist: John Deusing, American, active 1940s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.1607

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Person 99.2
Human 99.2
Person 98.6
Drawing 95.6
Art 95.6
Person 94.5
Sketch 83.6
Clinic 71.6
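Each machine-generated tag above is paired with a confidence score on a 0-100 scale. As an illustration only (not part of the record), a minimal sketch of filtering such labels by a confidence threshold, using a hypothetical subset of the Amazon tags above:

```python
# Hypothetical label data mirroring a subset of the Amazon tags above
# (value = confidence score, 0-100). Not taken from any live API call.
labels = {
    "Person": 99.2, "Human": 99.2, "Drawing": 95.6, "Art": 95.6,
    "Sketch": 83.6, "Clinic": 71.6,
}

def confident_labels(labels, threshold=90.0):
    """Return labels at or above the threshold, highest confidence first."""
    kept = [(name, score) for name, score in labels.items() if score >= threshold]
    return sorted(kept, key=lambda pair: pair[1], reverse=True)

print(confident_labels(labels))
# → [('Person', 99.2), ('Human', 99.2), ('Drawing', 95.6), ('Art', 95.6)]
```

A threshold like 90 keeps the high-confidence labels (Person, Drawing, Art) while dropping weaker guesses such as "Clinic" at 71.6.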

Clarifai
created on 2023-10-15

people 99.9
group 99
adult 98.8
group together 97.7
man 97.2
woman 96.8
three 94.3
administration 94.1
monochrome 93.9
furniture 93.6
child 93.2
room 92.5
four 90.8
sit 90.7
two 89.8
education 89.6
leader 89.3
several 89.2
five 88.5
indoors 86.3

Imagga
created on 2021-12-14

man 40.3
office 37.4
people 36.3
person 35.9
male 34.1
adult 33.7
laptop 33.1
computer 32.9
business 32.2
room 30.2
meeting 30.2
businessman 30
nurse 29.4
indoors 29
table 27.8
businesswoman 26.4
sitting 25.8
happy 25.7
professional 25.5
work 25.1
home 23.9
smiling 23.9
working 23.9
businesspeople 23.7
together 23.7
barbershop 23.4
women 22.9
team 22.4
group 21.8
teamwork 21.3
corporate 20.6
men 20.6
worker 20.6
shop 20.2
executive 18.9
talking 18.1
desk 17.4
lifestyle 17.4
casual 17
patient 16.9
modern 16.8
colleagues 16.5
couple 15.7
mercantile establishment 15.2
smile 15
technology 14.8
successful 14.6
sofa 14.5
workplace 14.3
interior 14.2
job 14.2
happiness 14.1
education 13.9
indoor 13.7
communication 13.4
hospital 13.3
classroom 13.1
looking 12.8
mid adult 12.5
discussion 11.7
two people 11.7
portrait 11.7
30s 11.5
place of business 11.4
success 11.3
manager 11.2
two 11
suit 10.8
living room 10.8
teacher 10.6
company 10.2
document 10.2
relaxation 10.1
discussing 9.8
life 9.8
attractive 9.8
specialist 9.7
comfortable 9.6
inside 9.2
house 9.2
sick person 9.1
case 9.1
student 8.9
casual clothing 8.8
center 8.8
conference 8.8
couch 8.7
boss 8.6
living 8.5
face 8.5
career 8.5
color 8.3
leisure 8.3
occupation 8.3
20s 8.2
relaxing 8.2
cheerful 8.1
television 8
collaboration 7.9
paper 7.8
employee 7.8
conversation 7.8
reading 7.6
bed 7.6
adults 7.6
salon 7.5
senior 7.5
study 7.5
presentation 7.4
clothing 7.4
window 7.3
confident 7.3
chair 7.2
day 7.1

Google
created on 2021-12-14

Microsoft
created on 2021-12-14

text 98.8
person 92.7
clothing 89.7
table 87.2
furniture 75.3
man 64.8
drawing 59.5
old 41.9
clothes 17.7

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 26-42
Gender Female, 50.7%
Calm 88%
Happy 7.7%
Sad 2.6%
Surprised 0.9%
Confused 0.3%
Angry 0.3%
Fear 0.1%
Disgusted 0.1%

AWS Rekognition

Age 28-44
Gender Female, 52.4%
Calm 93.3%
Happy 3.2%
Sad 1.1%
Angry 0.9%
Surprised 0.6%
Confused 0.5%
Disgusted 0.3%
Fear 0.2%

AWS Rekognition

Age 32-48
Gender Female, 51.9%
Calm 58.2%
Sad 35.1%
Fear 3.2%
Confused 1.9%
Angry 0.9%
Surprised 0.5%
Happy 0.2%
Disgusted 0.1%
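Each AWS Rekognition face block above reports a full distribution of emotion confidences. As an illustration only (the data below is hand-copied from the third face block, not fetched from the API), a minimal sketch of picking the dominant emotion from such a distribution:

```python
# Hypothetical emotion scores mirroring the third AWS Rekognition face above
# (value = confidence, 0-100). Not taken from any live API call.
emotions = {
    "Calm": 58.2, "Sad": 35.1, "Fear": 3.2, "Confused": 1.9,
    "Angry": 0.9, "Surprised": 0.5, "Happy": 0.2, "Disgusted": 0.1,
}

def dominant_emotion(emotions):
    """Return the (name, score) pair with the highest confidence."""
    return max(emotions.items(), key=lambda pair: pair[1])

print(dominant_emotion(emotions))  # → ('Calm', 58.2)
```

Note that for this face "Calm" wins at only 58.2%, with "Sad" a close 35.1%, so a dominant-label readout hides real ambiguity in the distribution.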

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.2%

Text analysis

Amazon

M 117
YT37A2
M 117 YT37A2 АЗДА
АЗДА