Human Generated Data

Title

Untitled (two elderly men holding children)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.16952

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Person 98.5
Human 98.5
Person 98.5
Person 97.7
Person 95.6
Interior Design 94.5
Indoors 94.5
Furniture 91.9
Tie 90.3
Accessories 90.3
Accessory 90.3
Home Decor 89.6
Room 89.3
People 86.1
Shoe 81.4
Footwear 81.4
Clothing 81.4
Apparel 81.4
Shoe 79.2
Lamp 75
Living Room 74.9
Table Lamp 68.8
Portrait 61.9
Photography 61.9
Face 61.9
Photo 61.9
Shoe 61.1
Chair 59.6
Linen 57.5
Monitor 55.4
Electronics 55.4
Screen 55.4
Display 55.4

Clarifai
created on 2023-10-29

people 99.7
man 97.8
adult 97.1
group 96
child 95
woman 94.7
two 93.7
monochrome 86.2
family 84.4
group together 83.5
leader 83.4
sit 82.3
wear 81.9
three 81.7
indoors 81.3
chair 80.3
furniture 79.8
veil 78.7
music 76.7
administration 76.6

Imagga
created on 2022-02-26

dishwasher 29.8
people 27.3
man 22.8
white goods 22.5
person 21.7
adult 19.5
male 18.5
home appliance 18.2
men 15.4
appliance 13.6
television 13.3
lifestyle 13
black 12.6
business 12.1
interior 11.5
human 11.2
home 11.2
couple 10.4
women 10.3
love 10.2
smiling 10.1
happy 10
blackboard 9.9
work 9.4
symbol 9.4
equipment 9.2
art 9.2
hand 9.1
holding 9.1
sport 9.1
portrait 9
professional 9
working 8.8
sitting 8.6
world 8.6
life 8.5
relax 8.4
musical instrument 8.3
worker 8.1
indoors 7.9
happiness 7.8
two 7.6
togetherness 7.5
house 7.5
leisure 7.5
silhouette 7.4
event 7.4
flag 7.4
water 7.3
dress 7.2
transportation 7.2
shop 7.1
family 7.1
job 7.1
child 7
sky 7
together 7

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

text 96.1
old 56.5
curtain 55
furniture 54.2
clothing 50.8

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 9-17
Gender Male, 98.8%
Calm 56.9%
Sad 32.7%
Happy 3.9%
Confused 1.9%
Fear 1.5%
Surprised 1.3%
Disgusted 1.1%
Angry 0.8%

AWS Rekognition

Age 49-57
Gender Male, 97.4%
Happy 80.3%
Disgusted 8.2%
Sad 4.2%
Calm 2.8%
Confused 1.7%
Surprised 1.4%
Angry 0.9%
Fear 0.5%

AWS Rekognition

Age 47-53
Gender Male, 81%
Calm 94.5%
Sad 4.2%
Happy 0.5%
Angry 0.3%
Confused 0.2%
Disgusted 0.1%
Fear 0.1%
Surprised 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Tie
Shoe
Person 98.5%
Person 98.5%
Person 97.7%
Person 95.6%
Tie 90.3%
Shoe 81.4%
Shoe 79.2%
Shoe 61.1%

Categories

Imagga

paintings art 83.5%
interior objects 13.4%
text visuals 2.7%

Text analysis

Amazon

a
MJIR
MJIR YT33AS In
YT33AS
In

Google

MJI7 YT3RA2 00 A
MJI7
YT3RA2
00
A