Human Generated Data

Title

Untitled (woman and four men reading piece of paper while sitting on outdoor steps)

Date

1940-1960

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.10027

Machine Generated Data

Tags

Amazon
created on 2022-01-28

Person 99.8
Human 99.8
Clothing 99.4
Apparel 99.4
Person 99.2
Person 98.8
Person 96.5
Sitting 92.3
Person 89.7
Person 89.6
Furniture 83.9
Chair 83.6
Shoe 83
Footwear 83
Coat 78.1
Shoe 74.7
Shirt 72.3
Face 68.9
Portrait 68.6
Photography 68.6
Photo 68.6
Suit 68.5
Overcoat 68.5
Shorts 66.1
Leisure Activities 63.4
Table 61.1
Pants 59.6
Screen 58.8
Electronics 58.8
Floor 58.5
Monitor 57.1
Display 57.1
Musician 57
Musical Instrument 57
LCD Screen 55.9
Sleeve 55.8
Flooring 55.8
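
The Amazon tags above are the kind of label output returned by the Rekognition DetectLabels operation. Below is a minimal sketch of how such tags might be reproduced with boto3; the file name "photo.jpg" and the 55% confidence floor are assumptions for illustration, not part of the museum record.

# Sketch: label detection with AWS Rekognition via boto3.
# "photo.jpg" and MinConfidence=55 are assumptions for illustration.
import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=55,  # the list above bottoms out near 55%
    )

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")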

Clarifai
created on 2023-10-26

people 99.8
group 98.9
group together 98.8
man 97.7
adult 96.9
woman 95.2
child 88.7
education 86.5
three 86.5
five 85.9
four 85.7
several 85
leader 84.5
school 83
many 82.4
recreation 80.1
administration 77.7
actor 76.9
handshake 76.9
monochrome 76.9
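
The Clarifai concepts above come from a general image-recognition model. The sketch below uses Clarifai's v2 REST predict endpoint via requests; the model ID, endpoint path, and API-key placeholder are assumptions and should be checked against current Clarifai documentation.

# Sketch: concept prediction with Clarifai's v2 REST API.
# The model ID, endpoint and API key placeholder are assumptions for illustration.
import requests

API_KEY = "YOUR_CLARIFAI_KEY"
IMAGE_URL = "https://example.org/photo.jpg"  # assumption: any reachable image URL

response = requests.post(
    "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
)
response.raise_for_status()

for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")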

Imagga
created on 2022-01-28

man 45.1
person 44.7
male 41.2
people 40.8
office 33.9
meeting 33
businessman 32.7
business 29.8
adult 29.6
sitting 28.4
group 28.2
table 26.9
professional 26.8
room 26.7
men 25.8
indoors 25.5
teacher 25
patient 24.7
smiling 24.6
home 23.9
happy 23.8
together 23.7
businesswoman 23.7
team 23.3
women 22.9
teamwork 22.3
executive 22.1
senior 21.6
working 21.2
work 21.2
talking 20.9
colleagues 20.4
businesspeople 19.9
corporate 19.8
couple 19.2
mature 18.6
job 18.6
computer 18.5
laptop 18.4
worker 17.2
desk 17.1
nurse 16
communication 16
indoor 15.5
modern 15.4
educator 15
smile 15
case 14.9
lifestyle 14.5
30s 14.4
classroom 14.3
portrait 13.6
sick person 13.1
cheerful 13
two people 12.6
suit 12.6
manager 12.1
presentation 12.1
success 12.1
looking 12
casual 11.9
hospital 11.8
coworkers 11.8
conference 11.7
mid adult 11.6
kin 11.4
technology 11.1
two 11
grandma 10.8
discussion 10.7
medical 10.6
ethnic 10.5
company 10.2
horizontal 10.1
associates 9.8
staff 9.8
40s 9.7
interior 9.7
retired 9.7
retirement 9.6
four 9.6
elderly 9.6
boss 9.6
workplace 9.5
career 9.5
color 9.5
happiness 9.4
friends 9.4
successful 9.2
student 9.1
lady 8.9
partners 8.8
leader 8.7
education 8.7
day 8.6
friendship 8.4
health 8.3
20s 8.3
collaboration 7.9
casual clothing 7.8
60s 7.8
businessmen 7.8
grandfather 7.7
busy 7.7
drinking 7.7
illness 7.6
hand 7.6
enjoying 7.6
plan 7.6
showing 7.5
restaurant 7.5
holding 7.4
coffee 7.4
occupation 7.3
confident 7.3
clinic 7.2
employee 7.2
face 7.1
to 7.1
attractive 7
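
The Imagga tags above resemble the output of Imagga's /v2/tags endpoint. A hedged sketch using the REST API follows; the key, secret, and image URL are placeholders, not values from this record.

# Sketch: image tagging with the Imagga REST API (v2 /tags endpoint).
# API_KEY, API_SECRET and IMAGE_URL are placeholders for illustration.
import requests

API_KEY = "YOUR_IMAGGA_KEY"
API_SECRET = "YOUR_IMAGGA_SECRET"
IMAGE_URL = "https://example.org/photo.jpg"  # assumption: any reachable image URL

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
)
response.raise_for_status()

for tag in response.json()["result"]["tags"]:
    print(f"{tag['tag']['en']} {tag['confidence']:.1f}")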

Google
created on 2022-01-28

Microsoft
created on 2022-01-28

person 99.5
window 97.3
text 94.2
clothing 93.2
man 92.7
footwear 53.8
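
The Microsoft tags are consistent with the image-tagging operation of Azure Computer Vision. A minimal sketch with the Python SDK follows; the endpoint, key, and file name are placeholders, and SDK details may differ by version.

# Sketch: image tagging with the Azure Computer Vision Python SDK.
# ENDPOINT, KEY and "photo.jpg" are placeholders for illustration.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com/"
KEY = "YOUR_AZURE_KEY"

client = ComputerVisionClient(ENDPOINT, CognitiveServicesCredentials(KEY))

with open("photo.jpg", "rb") as f:
    result = client.tag_image_in_stream(f)

for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")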

Color Analysis

Face analysis

AWS Rekognition

Age 43-51
Gender Female, 69.7%
Happy 85.5%
Calm 13.9%
Sad 0.2%
Surprised 0.1%
Angry 0.1%
Confused 0.1%
Disgusted 0.1%
Fear 0%

AWS Rekognition

Age 45-51
Gender Female, 99.7%
Calm 33.6%
Sad 27.9%
Happy 17.1%
Surprised 13.7%
Confused 3.3%
Fear 2.2%
Disgusted 1.1%
Angry 1%

AWS Rekognition

Age 48-54
Gender Male, 99.8%
Sad 92.6%
Calm 3.4%
Happy 2%
Confused 0.8%
Surprised 0.4%
Disgusted 0.3%
Fear 0.3%
Angry 0.3%

AWS Rekognition

Age 39-47
Gender Male, 99.1%
Happy 76.9%
Sad 13.6%
Surprised 3%
Calm 1.9%
Disgusted 1.4%
Confused 1.2%
Fear 1.2%
Angry 0.9%
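
The four AWS Rekognition face records above (age range, gender, and per-emotion scores) match the FaceDetails structure returned by the DetectFaces operation. A sketch with boto3 follows, assuming the same local file name as before.

# Sketch: face analysis with AWS Rekognition DetectFaces via boto3.
# "photo.jpg" is an assumed local file name.
import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age range, gender and emotions
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")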

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely
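
The Google Vision rows above (Surprise, Anger, Sorrow, Joy, Headwear, Blurred with likelihood buckets) correspond to the FaceAnnotation likelihood fields returned by the face detection feature. A sketch with the google-cloud-vision client library follows; "photo.jpg" is again an assumed local file name.

# Sketch: face detection with the Google Cloud Vision client library.
# "photo.jpg" is an assumed local file name.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)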

Feature analysis

Amazon

Person
Shoe
Person 99.8%
Person 99.2%
Person 98.8%
Person 96.5%
Person 89.7%
Person 89.6%
Shoe 83%
Shoe 74.7%

Categories

Imagga

people portraits 94.2%
events parties 5.2%

Text analysis

Amazon

2
MJ17--YT37A°--X

Google

MJI3--YT33A°2--XAON
MJI3--YT33A°2--XAON
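
The strings above are OCR readings of the edge markings on the negative, returned by AWS Rekognition DetectText and by Google Vision text detection. A sketch of the Rekognition call follows, with the same assumed file name; Google's text_detection method is analogous.

# Sketch: text detection with AWS Rekognition DetectText via boto3.
# "photo.jpg" is an assumed local file name.
import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = rekognition.detect_text(Image={"Bytes": f.read()})

# Print line-level detections only (Rekognition also returns word-level ones).
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])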