Human Generated Data

Title

Untitled (group portrait of ten-member family sitting in living room)

Date

1939

People

Artist: O. B. Porter Studio, American, active 1930s-1940s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.10923

Machine Generated Data

Tags (label confidence, %)

Amazon
created on 2022-02-05

Person 99.1
Human 99.1
Person 98.8
Person 98.4
Person 98.4
Person 97.7
Person 97.6
Interior Design 97.1
Indoors 97.1
Furniture 95.1
Room 94.7
Person 93.3
Clothing 93.1
Apparel 93.1
Person 91.4
Person 91.1
People 87.3
Living Room 82.8
Crowd 82.4
Chair 77
Person 75.2
Table 75
Face 72.8
Screen 69.2
Electronics 69.2
Monitor 68.7
Display 68.7
Portrait 65.7
Photography 65.7
Photo 65.7
Audience 64.4
Tie 63.6
Accessories 63.6
Accessory 63.6
Dining Table 62.5
Floor 62.2
Suit 60.9
Coat 60.9
Overcoat 60.9
Female 60.6
Sitting 59.2
Sideboard 57.5
Flooring 56.9
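
The label/confidence pairs above have the shape of output returned by AWS Rekognition's DetectLabels operation. As a rough illustration only (not the museum's actual pipeline), a minimal boto3 sketch along these lines would produce similar tags; the file name and the MinConfidence threshold are assumptions:

```python
import boto3

# Assumes AWS credentials are already configured; "photo.jpg" is a placeholder path.
rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=55,  # roughly the lowest score appearing in the list above
    )

# Print label name and confidence, mirroring the "Label 99.1" rows above.
for label in response["Labels"]:
    print(label["Name"], round(label["Confidence"], 1))
```

The Feature analysis section further below appears to correspond to the per-instance detections (bounding boxes with their own confidences) that DetectLabels also returns for labels such as Person and Tie.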

Clarifai
created on 2023-10-29

people 99.6
group 97.8
group together 97.1
man 96.4
woman 96.4
many 94.7
child 92.9
education 90.3
adult 90.2
indoors 87.6
leader 82.7
uniform 82.6
crowd 77.5
recreation 76.6
chair 75.7
school 75.7
wear 73.2
boy 72.9
monochrome 70.1
league 69.9

Imagga
created on 2022-02-05

person 40
man 39
waiter 36.9
people 36.8
male 34.1
room 32.1
nurse 30.8
adult 29.4
meeting 29.2
businessman 29.1
professional 28.8
business 28.6
worker 27
office 26.7
dining-room attendant 26.5
businesswoman 25.5
smiling 25.3
men 24.9
women 24.5
couple 24.4
employee 24
teacher 23.3
happy 23.2
together 22.8
team 22.4
sitting 22.3
group 21.8
table 21.6
indoors 21.1
talking 20.9
businesspeople 20.9
colleagues 19.4
work 18.8
mature 18.6
home 18.3
teamwork 17.6
corporate 17.2
executive 16.2
job 15.9
portrait 15.5
desk 15.1
modern 14.7
cheerful 14.6
classroom 14.6
indoor 14.6
educator 14.4
communication 14.3
working 14.1
student 13.8
20s 13.7
lifestyle 13.7
smile 13.5
happiness 13.3
conference 12.7
40s 12.7
two 11.9
hospital 11.7
discussion 11.7
30s 11.5
interior 11.5
senior 11.2
casual 11
laptop 10.9
coworkers 10.8
casual clothing 10.8
holding 10.7
four 10.5
education 10.4
manager 10.2
camera 10.2
successful 10.1
suit 9.9
associates 9.8
medical 9.7
success 9.7
mid adult 9.6
computer 9.6
standing 9.6
restaurant 9.5
color 9.5
doctor 9.4
patient 9.4
presentation 9.3
life 9.3
new 8.9
boardroom 8.9
four people 8.9
discussing 8.8
cooperation 8.7
partnership 8.6
day 8.6
boss 8.6
clothing 8.6
chair 8.5
enjoying 8.5
hall 8.5
study 8.4
horizontal 8.4
drink 8.4
wine 8.3
family 8
looking 8
class 7.7
health 7.6
ethnic 7.6
togetherness 7.6
friends 7.5
friendship 7.5
board 7.5
document 7.4
occupation 7.3
confident 7.3
to 7.1
glass 7

Google
created on 2022-02-05

Black 89.7
Black-and-white 84.4
Style 83.9
Window 79.7
Vintage clothing 76.4
Monochrome photography 76.3
Monochrome 75.4
Building 75.1
Picture frame 75
Snapshot 74.3
Event 73.6
Curtain 72.2
Classic 71.9
Uniform 65.6
Room 65.4
Suit 64.8
Team 64.8
Child 64.7
Stock photography 64.7
Sitting 60.8
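
The Google tags above plausibly come from Cloud Vision label detection, though the exact endpoint used is not stated here. A minimal google-cloud-vision sketch, with a placeholder file name, that prints labels in the same name/score form:

```python
from google.cloud import vision

# Assumes GOOGLE_APPLICATION_CREDENTIALS is set; "photo.jpg" is a placeholder path.
client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.label_detection(image=image)

# label_annotations carry a description and a 0-1 score; scale to match the list above.
for label in response.label_annotations:
    print(label.description, round(label.score * 100, 1))
```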

Microsoft
created on 2022-02-05

wedding dress 99.4
bride 98.6
person 95.9
wedding 94.8
dress 91.5
woman 89.4
text 86.4
clothing 85.3
man 69.5

Face analysis

AWS Rekognition

Age 47-53
Gender Male, 80.8%
Happy 81.2%
Calm 14.4%
Sad 1.6%
Fear 1%
Confused 0.7%
Surprised 0.5%
Angry 0.4%
Disgusted 0.3%

AWS Rekognition

Age 48-56
Gender Male, 87.8%
Calm 59.1%
Sad 16%
Surprised 13.7%
Disgusted 3.3%
Happy 2.6%
Angry 2.2%
Confused 2.1%
Fear 1.1%

AWS Rekognition

Age 43-51
Gender Male, 100%
Sad 84.4%
Calm 6.3%
Fear 4%
Confused 1.7%
Surprised 1.2%
Angry 1.1%
Disgusted 0.6%
Happy 0.5%

AWS Rekognition

Age 33-41
Gender Male, 99.7%
Calm 92.4%
Happy 3.9%
Sad 2.3%
Disgusted 0.4%
Angry 0.4%
Confused 0.3%
Surprised 0.2%
Fear 0.2%

AWS Rekognition

Age 40-48
Gender Female, 98.7%
Calm 90.3%
Happy 5.2%
Sad 3.4%
Angry 0.3%
Surprised 0.2%
Disgusted 0.2%
Confused 0.2%
Fear 0.2%

AWS Rekognition

Age 35-43
Gender Male, 89.5%
Happy 34.7%
Sad 20.6%
Calm 17.6%
Surprised 12.7%
Fear 4.9%
Disgusted 3.4%
Angry 3.3%
Confused 2.9%

AWS Rekognition

Age 54-62
Gender Male, 97.4%
Calm 97.6%
Sad 1.2%
Disgusted 0.4%
Confused 0.3%
Surprised 0.2%
Fear 0.1%
Happy 0.1%
Angry 0.1%

AWS Rekognition

Age 20-28
Gender Female, 93.7%
Calm 55.6%
Sad 36.7%
Happy 3.5%
Confused 1.4%
Angry 1%
Disgusted 0.7%
Surprised 0.6%
Fear 0.5%

AWS Rekognition

Age 25-35
Gender Male, 86.7%
Calm 96.7%
Sad 2.6%
Disgusted 0.2%
Angry 0.1%
Happy 0.1%
Fear 0.1%
Confused 0.1%
Surprised 0%
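
Each AWS Rekognition block above (age range, gender, and ranked emotions) matches one FaceDetail from the DetectFaces operation when full attributes are requested. A minimal boto3 sketch, with a placeholder file name, that prints the same fields for each detected face:

```python
import boto3

rekognition = boto3.client("rekognition")  # assumes configured AWS credentials

with open("photo.jpg", "rb") as f:  # placeholder file name
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # required to get age range, gender, and emotions
    )

# One FaceDetail per detected face, mirroring the per-face blocks above.
for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
```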

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
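
The Google Vision blocks above report per-face likelihood ratings rather than percentages. A minimal google-cloud-vision sketch, with a placeholder file name, that prints the same six attributes for each detected face:

```python
from google.cloud import vision

# Likelihood values are returned as enums; map them to the wording used above.
LIKELIHOOD = ("Unknown", "Very unlikely", "Unlikely",
              "Possible", "Likely", "Very likely")

client = vision.ImageAnnotatorClient()  # assumes configured credentials

with open("photo.jpg", "rb") as f:  # placeholder file name
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# One annotation per detected face, mirroring the per-face blocks above.
for face in response.face_annotations:
    print("Surprise", LIKELIHOOD[face.surprise_likelihood])
    print("Anger", LIKELIHOOD[face.anger_likelihood])
    print("Sorrow", LIKELIHOOD[face.sorrow_likelihood])
    print("Joy", LIKELIHOOD[face.joy_likelihood])
    print("Headwear", LIKELIHOOD[face.headwear_likelihood])
    print("Blurred", LIKELIHOOD[face.blurred_likelihood])
```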

Feature analysis

Amazon

Person
Tie
Person 99.1%
Person 98.8%
Person 98.4%
Person 98.4%
Person 97.7%
Person 97.6%
Person 93.3%
Person 91.4%
Person 91.1%
Person 75.2%
Tie 63.6%

Categories

Imagga

interior objects 97.2%
paintings art 1.9%

Text analysis

Amazon

АЗДА
MJ3YY33A2 АЗДА
MJ3YY33A2
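
The detected strings above have the shape of AWS Rekognition DetectText output, which reports both LINE and WORD detections (which is why the same string can appear more than once). A minimal boto3 sketch with a placeholder file name:

```python
import boto3

rekognition = boto3.client("rekognition")  # assumes configured AWS credentials

with open("photo.jpg", "rb") as f:  # placeholder file name
    response = rekognition.detect_text(Image={"Bytes": f.read()})

# LINE and WORD detections both appear; the listing above mixes the two levels.
for detection in response["TextDetections"]:
    print(detection["Type"], detection["DetectedText"],
          round(detection["Confidence"], 1))
```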