Human Generated Data

Title

Untitled (bride and guests looking at wedding presents)

Date

c. 1970

People

Artist: Ken Whitmire Associates, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19821

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Clothing 99
Apparel 99
Person 98.7
Human 98.7
Person 98.2
Person 97
Person 95.7
Indoors 89.5
Room 87.9
Gown 72.8
Fashion 72.8
Person 72.5
Robe 70
Suit 68.4
Coat 68.4
Overcoat 68.4
Furniture 66.4
Meal 64.7
Food 64.7
People 60
Wedding Gown 58.4
Wedding 58.4
Workshop 58.1
Living Room 57.1
Evening Dress 56
Person 45.4
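Each Amazon entry above pairs a label with a confidence score, the shape of output returned by label-detection services such as Rekognition. A minimal sketch of filtering such label/confidence pairs by a threshold — the sample values are copied from the list above, and the threshold of 90 is an arbitrary illustrative choice:

```python
# Filter machine-generated labels by confidence score.
# Sample (label, confidence) pairs copied from the Amazon list above.
labels = [
    ("Clothing", 99.0), ("Apparel", 99.0), ("Person", 98.7),
    ("Indoors", 89.5), ("Room", 87.9), ("Wedding Gown", 58.4),
    ("Person", 45.4),
]

def confident_labels(labels, threshold=90.0):
    """Keep only labels at or above the given confidence score."""
    return [name for name, score in labels if score >= threshold]

print(confident_labels(labels))  # -> ['Clothing', 'Apparel', 'Person']
```

Duplicate labels (e.g. multiple "Person" entries) correspond to separate detections of the same class, so they are kept rather than deduplicated here.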

Clarifai
created on 2023-10-22

people 99.6
monochrome 98
group 96.9
man 96.7
adult 96.1
room 93.4
many 91.4
group together 91
war 90.4
furniture 89.4
desk 87.7
woman 85.5
employee 85.3
grinder 85.2
industry 85
several 83.7
military 82.7
administration 82.4
sit 81.7
indoors 81.5

Imagga
created on 2022-03-05

computer 40
laptop 35.5
office 34.2
man 31.6
work 31.4
business 31
people 30.7
working 29.2
desk 27.2
person 27
male 25.5
home 23.9
indoors 23.7
table 23.2
corporate 22.3
happy 21.9
adult 21.9
professional 21.4
businessman 21.2
technology 20.8
sitting 20.6
smiling 20.3
executive 19.6
meeting 18.8
job 18.6
communication 17.6
lifestyle 17.3
businesswoman 17.3
room 17.2
interior 16.8
restaurant 15.9
modern 15.4
group 15.3
worker 15.2
smile 15
notebook 13.5
suit 13.5
chair 13.2
shop 13
device 12.9
men 12.9
salon 12.9
indoor 12.8
workplace 12.4
businesspeople 12.3
education 12.1
inside 12
classroom 11.9
women 11.9
cheerful 11.4
building 11.3
looking 11.2
casual 11
happiness 11
house 10.9
projector 10.6
structure 10.5
attractive 10.5
phone 10.1
glass 10.1
barbershop 9.9
team 9.9
success 9.7
equipment 9.6
wireless 9.5
teamwork 9.3
confident 9.1
portrait 9.1
handsome 8.9
employee 8.8
corporation 8.7
keyboard 8.6
talking 8.6
presentation 8.4
company 8.4
television 8.3
occupation 8.2
furniture 8.2
kitchen 8.1
patient 7.9
together 7.9
specialist 7.9
couple 7.8
colleagues 7.8
gesture 7.6
relax 7.6
dinner 7.6
career 7.6
adults 7.6
horizontal 7.5
senior 7.5
mercantile establishment 7.5
floor 7.4
mature 7.4
light 7.4
case 7.3

Google
created on 2022-03-05

Hat 84.9
Black-and-white 84.3
Automotive design 83.7
Monochrome photography 71.9
T-shirt 71.5
Cooking 68.5
Monochrome 67.8
Event 67.2
Art 66.6
Motor vehicle 65.1
Machine 62.9
Metal 62.6
Sitting 58.9
Room 58.3
Street 53.3
Still life photography 53.2
Job 52.5
Cap 51.9
Suit 50.4
Table 50.3

Microsoft
created on 2022-03-05

text 97.4
clothing 91.9
indoor 91.6
person 91.5
black and white 82.9
man 74.8
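The five services above tag the same image independently, so tags reported by several of them are more likely reliable than any single service's guess. A sketch of finding such consensus tags — the per-service tag sets are abbreviated, lower-cased samples from the lists above, and the cutoff of three services is an illustrative choice:

```python
from collections import Counter

# Abbreviated, lower-cased tag samples from the service lists above.
services = {
    "Amazon":    {"clothing", "person", "indoors", "room", "suit"},
    "Clarifai":  {"people", "room", "furniture", "indoors", "man"},
    "Imagga":    {"people", "room", "suit", "indoors", "man"},
    "Microsoft": {"clothing", "person", "indoor", "man"},
}

# Count how many services reported each tag.
counts = Counter(tag for tags in services.values() for tag in tags)

# Keep tags reported by at least three of the four services.
consensus = sorted(tag for tag, n in counts.items() if n >= 3)
print(consensus)  # -> ['indoors', 'man', 'room']
```

Note that near-synonyms ("indoor" vs. "indoors", "person" vs. "people") are counted separately here; a real pipeline would normalize them first.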

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 43-51
Gender Male, 97.8%
Calm 99.7%
Surprised 0.2%
Sad 0.1%
Disgusted 0%
Confused 0%
Happy 0%
Fear 0%
Angry 0%

AWS Rekognition

Age 29-39
Gender Male, 99.9%
Calm 89.9%
Angry 4.6%
Happy 2.4%
Confused 1.1%
Surprised 0.9%
Disgusted 0.6%
Sad 0.3%
Fear 0.2%

AWS Rekognition

Age 29-39
Gender Female, 99.1%
Calm 31.3%
Sad 30%
Surprised 23.8%
Fear 5.2%
Happy 5%
Angry 2.7%
Disgusted 1.3%
Confused 0.6%

AWS Rekognition

Age 47-53
Gender Female, 99.2%
Calm 97.4%
Confused 0.8%
Surprised 0.6%
Sad 0.4%
Fear 0.3%
Happy 0.2%
Angry 0.2%
Disgusted 0.1%
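Each AWS Rekognition face record above reports an age range, a gender estimate, and a full emotion distribution. A sketch of extracting the dominant emotion from one such record — the dictionary mirrors Rekognition's FaceDetails response shape, with values taken from the first face entry above (only the top three emotions are included for brevity):

```python
# A Rekognition-style face record; values from the first
# AWS Rekognition face entry above (top three emotions only).
face = {
    "AgeRange": {"Low": 43, "High": 51},
    "Gender": {"Value": "Male", "Confidence": 97.8},
    "Emotions": [
        {"Type": "CALM", "Confidence": 99.7},
        {"Type": "SURPRISED", "Confidence": 0.2},
        {"Type": "SAD", "Confidence": 0.1},
    ],
}

def dominant_emotion(face):
    """Return the (emotion, confidence) pair with the highest score."""
    top = max(face["Emotions"], key=lambda e: e["Confidence"])
    return top["Type"], top["Confidence"]

print(dominant_emotion(face))  # -> ('CALM', 99.7)
```

For the third face above the call would be far less decisive: its top emotion (Calm, 31.3%) barely edges out Sad (30%), so downstream code should treat the confidence value, not just the label, as meaningful.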

Feature analysis

Amazon

Person
Person 98.7%
Person 98.2%
Person 97%
Person 95.7%
Person 72.5%
Person 45.4%

Text analysis

Amazon

EXIT
18

Google

EXIT -YT3RA°2--XÁGOX
EXIT
-YT3RA°2--XÁGOX