Human Generated Data

Title

Candids by Bradford Bachrach

Date

1975

People

Artist: Bradford Bachrach, American, active 1912–1993

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.430

Machine Generated Data

Tags

Amazon
created on 2019-11-11

Person 99.5
Human 99.5
Person 98.1
Person 97.4
Person 97.2
Priest 96.3
Person 96.2
Bishop 95.1
Person 93.8
Person 92.6
Art 86.4
Painting 86.4
Person 76.8
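
These labels are the kind of output produced by Amazon Rekognition's DetectLabels operation. A minimal sketch in Python with boto3 follows; the bucket, object key, and thresholds are illustrative placeholders, not values taken from this record.

import boto3

# Minimal sketch: label detection with Amazon Rekognition.
# Bucket, key, and thresholds are placeholders, not values from this record.
rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "bachrach-candids.jpg"}},
    MaxLabels=25,
    MinConfidence=70,
)

# Print each label with its confidence, mirroring the tag/score pairs above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")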

Clarifai
created on 2019-11-11

people 99.6
wear 99.5
woman 98.6
adult 97.9
group 97.6
print 97.1
painting 95.8
illustration 95.7
dress 95.5
art 94.8
fashion 94.5
no person 94.3
one 94
outfit 93.6
man 93.4
two 93.1
many 91.1
veil 90.8
leader 90.7
four 89.9
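
The Clarifai concepts above would typically come from the v2 predict endpoint for the general model. The sketch below assumes the v2 REST layout; the API key, model identifier, and image URL are placeholders.

import requests

# Rough sketch of a Clarifai v2 predict request; key, model ID, and URL are placeholders.
CLARIFAI_API_KEY = "YOUR_API_KEY"
MODEL_ID = "general-image-recognition"  # assumed general-model identifier

resp = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {CLARIFAI_API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": "https://example.org/bachrach-candids.jpg"}}}]},
)

# Concepts come back with a 0-1 value; scale to match the percentages above.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")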

Imagga
created on 2019-11-11

old 20.2
blackboard 19.1
money 18.7
currency 17.9
paper 16.5
cash 16.5
drawing 16.2
man 16.1
art 16.1
finance 15.2
sketch 14.8
retro 14.7
structure 14.7
bank 14.3
bill 14.3
financial 14.2
vintage 14.1
business 14
banking 12.9
person 12.3
book jacket 12.3
people 11.7
black 11.4
wall 11.1
design 10.8
symbol 10.8
envelope 10.7
altar 10.7
male 10.6
home 10.4
grunge 10.2
billboard 10
wealth 9.9
jacket 9.7
close 9.7
dollars 9.7
representation 9.4
dollar 9.3
face 9.2
note 9.2
letter 9.2
frame 9.2
banknote 8.7
antique 8.7
ancient 8.6
profit 8.6
exchange 8.6
card 8.6
child 8.5
portrait 8.4
head 8.4
savings 8.4
economy 8.3
clock 8.3
sign 8.3
investment 8.2
message 8.2
one 8.2
pattern 8.2
office 8
decoration 8
sculpture 8
bills 7.8
blank 7.7
stained 7.7
payment 7.7
culture 7.7
covering 7.5
house 7.5
happy 7.5
object 7.3
adult 7.3
signboard 7.3
wall clock 7.3
wrapping 7.3
stamp 7.2
architecture 7
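
Imagga's tags are served by its v2 tagging endpoint with HTTP Basic authentication. A sketch follows; the API key, secret, and image URL are placeholders.

import requests

# Sketch of the Imagga v2 tagging endpoint; credentials and image URL are placeholders.
IMAGGA_KEY = "YOUR_API_KEY"
IMAGGA_SECRET = "YOUR_API_SECRET"

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/bachrach-candids.jpg"},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),
)

# Each entry carries an English label and a confidence score, as listed above.
for item in resp.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")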

Google
created on 2019-11-11

Microsoft
created on 2019-11-11

wall 99.8
gallery 99.5
room 97.7
dress 97.3
indoor 96.1
scene 95.6
wedding dress 95.2
clothing 93
text 87.9
person 84.8
bride 84.6
woman 83.1
posing 82.8
white 64.5
wedding 56.2
picture frame 9.6
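
The Microsoft tags correspond to the Computer Vision analyze operation with the Tags feature. A sketch against the v2.0 REST endpoint follows; the region, subscription key, and image URL are placeholders.

import requests

# Sketch of an Azure Computer Vision analyze call (Tags feature); placeholders throughout.
ENDPOINT = "https://westus.api.cognitive.microsoft.com/vision/v2.0/analyze"
SUBSCRIPTION_KEY = "YOUR_SUBSCRIPTION_KEY"

resp = requests.post(
    ENDPOINT,
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": SUBSCRIPTION_KEY},
    json={"url": "https://example.org/bachrach-candids.jpg"},
)

# Tag confidences are returned on a 0-1 scale; convert to percentages as shown above.
for tag in resp.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")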

Color Analysis

Face analysis

Amazon

Microsoft

Google

AWS Rekognition

Age 23-37
Gender Female, 54.9%
Happy 54.2%
Calm 45.1%
Angry 45.1%
Surprised 45.4%
Fear 45%
Disgusted 45.1%
Confused 45.1%
Sad 45%

AWS Rekognition

Age 21-33
Gender Female, 54.2%
Calm 54.5%
Surprised 45%
Angry 45.2%
Sad 45.1%
Happy 45.1%
Confused 45%
Disgusted 45%
Fear 45%

AWS Rekognition

Age 7-17
Gender Female, 54.8%
Surprised 45.6%
Happy 52.2%
Confused 45.3%
Angry 45.2%
Disgusted 46.2%
Sad 45.1%
Calm 45.3%
Fear 45.3%

AWS Rekognition

Age 18-30
Gender Male, 53.2%
Angry 47.8%
Sad 48.8%
Calm 46.5%
Confused 45.2%
Fear 45.8%
Surprised 45.1%
Happy 45.5%
Disgusted 45.3%

AWS Rekognition

Age 21-33
Gender Female, 54.3%
Sad 47.6%
Disgusted 45%
Angry 45%
Confused 45%
Fear 45.1%
Happy 52.1%
Surprised 45%
Calm 45.1%

AWS Rekognition

Age 13-25
Gender Female, 51.3%
Sad 46.2%
Disgusted 45.2%
Angry 46.8%
Confused 45.2%
Fear 45.5%
Happy 49.6%
Surprised 45.2%
Calm 46.2%

AWS Rekognition

Age 29-45
Gender Female, 54.5%
Fear 45.3%
Surprised 45.2%
Calm 48.7%
Angry 46.3%
Sad 46.8%
Confused 45.2%
Happy 47.1%
Disgusted 45.4%

AWS Rekognition

Age 29-45
Gender Female, 53.8%
Fear 45.1%
Sad 45.1%
Disgusted 45.3%
Confused 45.1%
Surprised 45.1%
Angry 54.1%
Happy 45.1%
Calm 45.1%
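
The per-face age ranges, gender estimates, and emotion scores above are the shape of output returned by Rekognition's DetectFaces operation when all attributes are requested. A sketch with boto3 follows; the bucket and key are placeholders.

import boto3

# Sketch of Rekognition DetectFaces with full attributes; bucket and key are placeholders.
rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_faces(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "bachrach-candids.jpg"}},
    Attributes=["ALL"],  # request age range, gender, and emotion attributes
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")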

Microsoft Cognitive Services

Age 26
Gender Female
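
The single age/gender estimate above matches the Microsoft Face API detect operation with age and gender attributes requested. A sketch against the v1.0 REST endpoint follows; the region, key, and image URL are placeholders.

import requests

# Sketch of a Microsoft Face API detect call; region, key, and URL are placeholders.
FACE_ENDPOINT = "https://westus.api.cognitive.microsoft.com/face/v1.0/detect"
SUBSCRIPTION_KEY = "YOUR_SUBSCRIPTION_KEY"

resp = requests.post(
    FACE_ENDPOINT,
    params={"returnFaceAttributes": "age,gender"},
    headers={"Ocp-Apim-Subscription-Key": SUBSCRIPTION_KEY},
    json={"url": "https://example.org/bachrach-candids.jpg"},
)

for face in resp.json():
    attrs = face["faceAttributes"]
    print(f"Age {attrs['age']:.0f}")
    print(f"Gender {attrs['gender'].title()}")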

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very likely
Blurred Very unlikely
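
The likelihood labels above mirror the face-annotation fields returned by Google Cloud Vision face detection. A sketch with a recent google-cloud-vision client follows; the image URI is a placeholder.

from google.cloud import vision

# Sketch of Google Cloud Vision face detection; the image URI is a placeholder.
client = vision.ImageAnnotatorClient()
image = vision.Image(source=vision.ImageSource(image_uri="https://example.org/bachrach-candids.jpg"))

response = client.face_detection(image=image)

# Likelihood fields are enums such as VERY_UNLIKELY, mirroring the labels above.
for face in response.face_annotations:
    print("Surprise:", face.surprise_likelihood.name)
    print("Anger:", face.anger_likelihood.name)
    print("Sorrow:", face.sorrow_likelihood.name)
    print("Joy:", face.joy_likelihood.name)
    print("Headwear:", face.headwear_likelihood.name)
    print("Blurred:", face.blurred_likelihood.name)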

Feature analysis

Amazon

Person 99.5%
Painting 86.4%

Categories

Imagga

paintings art 99.9%