Human Generated Data

Title

Untitled (women gathered around table with merchandise)

Date

1954

People

Artist: John Deusing (American, active 1940s)

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.1572

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Person 98.9
Human 98.9
Person 98.7
Person 97.5
Room 95.6
Indoors 95.6
Person 91.8
Interior Design 88.7
Clothing 88.1
Apparel 88.1
Furniture 87.9
Clinic 86
People 68.3
Screen 63.6
Electronics 63.6
Dressing Room 56.1
Bed 56

Clarifai
created on 2023-10-15

people 99.7
adult 97.7
man 96.2
monochrome 95.9
group 95.9
woman 91.3
group together 90.4
vehicle 87.5
three 86.3
several 85.6
two 84.2
war 83.8
administration 83.5
many 83.2
wear 81.7
leader 76.9
military 73.8
child 73.7
four 73.6
actor 67

Imagga
created on 2021-12-14

work 25.3
people 23.4
person 22.6
medical 21.2
professional 20.8
worker 20.7
business 19.4
technology 19.3
working 18.5
businessman 18.5
medicine 18.5
laboratory 18.3
man 17.8
biology 17.1
lab 16.5
chemistry 16.4
research 16.2
chemical 15.7
plaything 15.7
test 15.4
team 15.2
science 15.1
doctor 15
male 14.9
scientist 14.7
office 14.6
table 14.1
health 13.9
student 13.7
scientific 13.6
human 13.5
equipment 13.4
coat 12.8
glass 12.1
computer 12
adult 11.9
chemist 11.8
hand 11.4
businesspeople 11.4
education 11.3
looking 11.2
men 11.2
toy 11
biochemistry 10.8
assistant 10.7
colleagues 10.7
instrument 10.7
negative 10.5
modern 10.5
development 10.5
corporate 10.3
decoration 10.2
teamwork 10.2
suit 10.2
clinic 9.9
researcher 9.9
microscope 9.8
biotechnology 9.8
technician 9.8
party 9.4
film 9.4
drink 9.2
businesswoman 9.1
microbiology 8.9
job 8.8
happy 8.8
meeting 8.5
wedding 8.3
nurse 8.3
laptop 8.2
brass 7.9
observation 7.9
day 7.8
reception 7.8
optical 7.8
portrait 7.8
tube 7.7
career 7.6
manager 7.4
successful 7.3
indoor 7.3
group 7.2
smiling 7.2
celebration 7.2
device 7.1
information 7.1

Google
created on 2021-12-14

Microsoft
created on 2021-12-14

person 98
text 91.6
clothing 90.7
people 81.1
old 79.3
group 70.3
wedding dress 59
preparing 47.6
cooking 36.8

Color Analysis

Face analysis

AWS Rekognition

Age 19-31
Gender Female, 80.4%
Happy 63.5%
Sad 11.9%
Calm 8.8%
Surprised 6.9%
Fear 4.5%
Angry 2%
Confused 1.9%
Disgusted 0.5%

AWS Rekognition

Age 33-49
Gender Male, 79.7%
Calm 86.3%
Sad 12.7%
Confused 0.3%
Happy 0.3%
Angry 0.2%
Surprised 0.1%
Fear 0.1%
Disgusted 0%

AWS Rekognition

Age 6-16
Gender Female, 80.3%
Calm 56.4%
Sad 30.8%
Surprised 5.7%
Happy 4.2%
Fear 1%
Confused 0.8%
Angry 0.8%
Disgusted 0.2%

AWS Rekognition

Age 23-37
Gender Male, 50.5%
Sad 64.5%
Angry 9%
Calm 8.4%
Surprised 8.2%
Confused 4.3%
Fear 3.1%
Happy 1.9%
Disgusted 0.6%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.9%

Categories

Imagga

paintings art 99.9%

Text analysis

Amazon

©FUNT
2.VEF