Human Generated Data

Title

Untitled (man standing in front of room lecturing to group of men in chairs)

Date

c. 1950

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.14324

Machine Generated Data

Tags

Amazon
created on 2022-01-29

Person 99.1
Human 99.1
Clothing 98.6
Apparel 98.6
Person 98.2
Person 97.8
Sitting 97.3
Suit 96.2
Overcoat 96.2
Coat 96.2
Person 93.8
Person 92.1
Tie 91
Accessories 91
Accessory 91
Person 89
Furniture 85.6
Person 84.2
Chair 83.1
Tuxedo 82.3
Person 81.9
Crowd 76
Tie 74.5
People 72.4
Female 69.8
Person 67.2
Person 66.3
Table 64.3
Photography 62.1
Photo 62.1
Indoors 59.6
Room 55.5
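
The Amazon labels above can be reproduced, in rough outline, with the Rekognition detect_labels call. A minimal sketch, assuming boto3 is configured with AWS credentials and that the image is available locally as photograph.jpg (a hypothetical filename):

    import boto3

    # Create a Rekognition client; region and credentials come from the AWS config.
    client = boto3.client("rekognition")

    # Send the image bytes and ask for up to 50 labels.
    with open("photograph.jpg", "rb") as f:
        response = client.detect_labels(Image={"Bytes": f.read()}, MaxLabels=50)

    # Each label carries a name and a confidence score (0-100), comparable to
    # the "Person 99.1", "Clothing 98.6", ... entries listed above.
    for label in response["Labels"]:
        print(label["Name"], round(label["Confidence"], 1))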

Clarifai
created on 2023-10-27

people 99.7
group 99.1
man 99
adult 97.4
group together 97.3
league 95.4
meeting 94.8
leader 94.6
chair 94
monochrome 93.6
actor 90
woman 89.4
sit 87.9
partnership 86.9
many 85.5
seminar 84.9
several 83
corporate 83
indoors 80.4
office 78.6

Imagga
created on 2022-01-29

person 40
man 38.4
people 37.4
male 33.4
professional 27.3
adult 26.3
office 24.9
businessman 23.8
worker 22.5
business 20.7
happy 20.1
team 19.7
home 19.1
nurse 18.9
life 18.9
smiling 18.8
indoors 18.5
men 18
meeting 17.9
patient 17.5
table 17.3
sitting 17.2
room 16.8
couple 16.6
work 16.5
businesswoman 16.4
lifestyle 15.9
computer 15.2
working 15
teacher 14.6
laptop 14.6
group 14.5
executive 14.3
women 14.2
businesspeople 14.2
job 14.2
medical 14.1
teamwork 13.9
talking 13.3
desk 13.2
cheerful 13
corporate 12.9
happiness 12.5
together 12.3
smile 12.1
two 11.9
coat 11.3
doctor 11.3
health 11.1
portrait 11
indoor 11
new 10.5
modern 10.5
senior 10.3
holding 9.9
salon 9.8
conference 9.8
human 9.7
interior 9.7
laboratory 9.6
mother 9.6
mature 9.3
communication 9.2
successful 9.2
family 8.9
associates 8.9
hospital 8.8
clinic 8.8
medicine 8.8
kin 8.8
two people 8.8
lab 8.7
partner 8.7
30s 8.7
day 8.6
casual 8.5
attractive 8.4
pretty 8.4
manager 8.4
presentation 8.4
hand 8.4
occupation 8.3
care 8.2
confident 8.2
board 8.1
clothing 8.1
success 8
hairdresser 8
love 7.9
teaching 7.8
education 7.8
planner 7.8
colleagues 7.8
waiter 7.8
test 7.7
partnership 7.7
biology 7.6
student 7.6
instrument 7.4
chair 7.3
suit 7.2
handsome 7.1
idea 7.1
to 7.1

Google
created on 2022-01-29

Coat 90.6
Suit 81.3
Vintage clothing 72.1
Event 71.3
Monochrome 67.9
Service 66.1
White-collar worker 65.6
Room 64.4
History 64.2
Monochrome photography 63.5
Sitting 62.8
Team 62.7
Stock photography 62.4
Art 61.7
Font 60.2
Classic 57.4
Crew 55
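
The Google tags above correspond to Cloud Vision label detection. A minimal sketch, assuming the google-cloud-vision client library and application credentials are set up, with photograph.jpg again standing in as a hypothetical filename:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()
    with open("photograph.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    # label_detection returns descriptions with scores in 0-1; multiplying by 100
    # gives values comparable to "Coat 90.6", "Suit 81.3", ...
    response = client.label_detection(image=image)
    for label in response.label_annotations:
        print(label.description, round(label.score * 100, 1))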

Microsoft
created on 2022-01-29

text 96.2
person 95.5
clothing 92.4
man 89.6
old 67.1
posing 59.3

Face analysis

AWS Rekognition

Age 33-41
Gender Female, 80.4%
Calm 99.3%
Sad 0.5%
Confused 0.1%
Disgusted 0%
Angry 0%
Happy 0%
Surprised 0%
Fear 0%

AWS Rekognition

Age 18-24
Gender Female, 74.8%
Calm 97%
Sad 1.4%
Surprised 0.5%
Angry 0.4%
Disgusted 0.2%
Confused 0.2%
Happy 0.1%
Fear 0.1%

AWS Rekognition

Age 47-53
Gender Male, 91.7%
Calm 79%
Sad 19.3%
Confused 0.7%
Fear 0.3%
Surprised 0.3%
Disgusted 0.2%
Angry 0.2%
Happy 0.1%

AWS Rekognition

Age 49-57
Gender Female, 89.1%
Calm 98.4%
Sad 0.6%
Confused 0.5%
Surprised 0.1%
Angry 0.1%
Happy 0.1%
Disgusted 0.1%
Fear 0%

AWS Rekognition

Age 38-46
Gender Male, 81.7%
Calm 53.4%
Happy 41.4%
Sad 2.3%
Angry 0.9%
Disgusted 0.6%
Fear 0.5%
Surprised 0.4%
Confused 0.4%

AWS Rekognition

Age 38-46
Gender Female, 79.1%
Calm 99.6%
Sad 0.2%
Happy 0.1%
Confused 0.1%
Angry 0%
Disgusted 0%
Surprised 0%
Fear 0%

AWS Rekognition

Age 41-49
Gender Female, 97.9%
Calm 59.8%
Sad 20.5%
Surprised 9.2%
Fear 4.4%
Confused 1.9%
Happy 1.5%
Angry 1.4%
Disgusted 1.3%
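
The age ranges, gender guesses, and emotion scores listed above are the kind of per-face output Rekognition's detect_faces returns when all attributes are requested. A minimal sketch, assuming the same boto3 setup and hypothetical photograph.jpg as before:

    import boto3

    client = boto3.client("rekognition")

    # Request the full attribute set so age range, gender, and emotions are included.
    with open("photograph.jpg", "rb") as f:
        faces = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

    for face in faces["FaceDetails"]:
        age = face["AgeRange"]        # e.g. {"Low": 33, "High": 41}
        gender = face["Gender"]       # e.g. {"Value": "Female", "Confidence": 80.4}
        # Emotions is a list of {"Type": ..., "Confidence": ...} entries.
        top = max(face["Emotions"], key=lambda e: e["Confidence"])
        print(age, gender["Value"], top["Type"], round(top["Confidence"], 1))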

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
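
The Google Vision rows report per-face likelihood buckets (Very unlikely ... Very likely) rather than numeric scores. A minimal sketch of how face_detection exposes them, under the same client-library assumptions as above:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()
    with open("photograph.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)
    for face in response.face_annotations:
        # Each field is a Likelihood enum; .name yields strings such as VERY_UNLIKELY.
        print(vision.Likelihood(face.joy_likelihood).name,
              vision.Likelihood(face.sorrow_likelihood).name,
              vision.Likelihood(face.anger_likelihood).name,
              vision.Likelihood(face.surprise_likelihood).name,
              vision.Likelihood(face.headwear_likelihood).name,
              vision.Likelihood(face.blurred_likelihood).name)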

Feature analysis

Amazon

Person
Person 99.1%
Person 98.2%
Person 97.8%
Person 93.8%
Person 92.1%
Person 89%
Person 84.2%
Person 81.9%
Person 67.2%
Person 66.3%

Tie
Tie 91%
Tie 74.5%

Categories

Imagga

paintings art 99.8%

Text analysis

Amazon

a
MJIR
YT33A2
MJIR YT33A2 محجم
محجم
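
The text detections above, including the apparently garbled readings, are the sort of output returned by Rekognition's detect_text. A minimal sketch, again assuming boto3 and the hypothetical local file photograph.jpg:

    import boto3

    client = boto3.client("rekognition")
    with open("photograph.jpg", "rb") as f:
        response = client.detect_text(Image={"Bytes": f.read()})

    # WORD and LINE detections are returned together; DetectedText holds the string.
    for det in response["TextDetections"]:
        print(det["Type"], det["DetectedText"], round(det["Confidence"], 1))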