Human Generated Data

Title

Untitled (county fair, central Ohio)

Date

August 1938

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.843

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags (confidence scores in %)

Amazon
created on 2023-10-07

Indoors 100
Restaurant 100
Food 100
Food Court 100
Cafeteria 100
Adult 99.2
Female 99.2
Person 99.2
Woman 99.2
Adult 99.1
Female 99.1
Person 99.1
Woman 99.1
Dining Table 98.9
Furniture 98.9
Table 98.9
Adult 98.9
Female 98.9
Person 98.9
Woman 98.9
Cafe 98.8
Person 98.3
Baby 98.3
Meal 98.2
Person 96.5
Person 96
Person 95.6
Person 95
Lunch 95
Person 94.2
Person 93.3
Person 92.1
Face 91.7
Head 91.7
Photography 91.7
Portrait 91.7
Person 91.4
Chair 91.3
Person 89.9
Person 89.7
Architecture 88.8
Building 88.8
Dining Room 88.8
Room 88.8
Person 87.6
Person 79
Dish 72.6
Person 72.4
Person 61.6
Bowl 57.4
Diner 55.8
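
The Amazon tags above are object-detection labels with per-label confidence scores. A minimal sketch of how comparable labels could be generated with the AWS Rekognition DetectLabels API follows (boto3; the file name, region, and thresholds are illustrative assumptions, not part of the museum record).

import boto3

# Sketch only: produce labels with confidence scores similar to the list above.
rekognition = boto3.client("rekognition", region_name="us-east-1")  # assumed region

with open("county_fair_central_ohio.jpg", "rb") as f:  # placeholder file name
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=50,
    MinConfidence=55,  # the lowest score in the list above is 55.8
)

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")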

Clarifai
created on 2018-05-11

people 100
group 99.4
adult 99.2
group together 98.7
many 98
administration 97.1
several 96.7
man 94.9
woman 94.8
leader 94.3
furniture 93.2
wear 90.8
vehicle 90.2
recreation 88.9
war 88.8
sit 88.6
two 86.6
chair 86.5
child 85.4
military 84.9

Imagga
created on 2023-10-07

man 35.7
computer 34.5
people 32.9
laptop 30.3
person 28.7
male 28.6
adult 25.4
business 24.9
working 24.7
seller 23.8
work 23.5
home 22.3
sitting 22.3
technology 22.3
happy 21.9
office 20.9
indoors 20.2
education 19
center 17.5
businesswoman 17.3
keyboard 16.9
looking 16.8
smiling 16.6
desk 16.4
student 16.3
together 15.8
group 15.3
lifestyle 15.2
learning 15
smile 15
professional 14.8
indoor 14.6
men 14.6
communication 14.3
classroom 14.2
job 14.2
table 14.1
school 13.8
portrait 13.6
casual 13.6
women 13.4
studying 13.4
room 13.2
meeting 13.2
couple 13.1
study 13.1
class 12.5
businesspeople 12.3
senior 12.2
notebook 12.1
executive 12
attractive 11.9
two 11.9
team 11.6
newspaper 11.6
worker 11.6
businessman 11.5
college 11.4
crossword puzzle 11.3
corporate 11.2
paper 11
scholar 10.8
engineer 10.6
using 10.6
intellectual 10.6
elderly 10.5
old 10.4
mature 10.2
teamwork 10.2
interior 9.7
colleagues 9.7
retired 9.7
monitor 9.6
university 9.6
wireless 9.5
happiness 9.4
face 9.2
sofa 9.2
hand 9.1
modern 9.1
cheerful 8.9
puzzle 8.9
child 8.8
talking 8.6
friends 8.5
screen 8.4
pretty 8.4
house 8.4
color 8.3
camera 8.3
book 8.2
children 8.2
teacher 8.1
lady 8.1
suit 8.1
building 8.1
success 8
30s 7.7
pen 7.7
retirement 7.7
workplace 7.6
reading 7.6
world 7.6
relax 7.6
sit 7.6
clothing 7.5
fun 7.5
manager 7.4
holding 7.4
document 7.4
glasses 7.4
product 7.4
alone 7.3
blond 7.3
bright 7.1
life 7.1
game 7

Google
created on 2018-05-11

Microsoft
created on 2018-05-11

person 94.5
dining table 7.2
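
The Microsoft tags above were generated in 2018, and the exact service version is not recorded here. A hedged sketch of how similar tags could be requested from the Azure Computer Vision "analyze" REST endpoint is shown below (the endpoint, key, API version, and file name are assumptions, not part of the record).

import requests

endpoint = "https://<your-resource>.cognitiveservices.azure.com"  # assumed resource endpoint
key = "<subscription-key>"                                        # assumed credential

with open("county_fair_central_ohio.jpg", "rb") as f:  # placeholder file name
    image_bytes = f.read()

resp = requests.post(
    f"{endpoint}/vision/v3.2/analyze",  # assumed API version
    params={"visualFeatures": "Tags"},
    headers={
        "Ocp-Apim-Subscription-Key": key,
        "Content-Type": "application/octet-stream",
    },
    data=image_bytes,
)
resp.raise_for_status()

# Azure reports tag confidence on a 0-1 scale; scale to % to match the list above.
for tag in resp.json().get("tags", []):
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")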

Color Analysis

Face analysis

AWS Rekognition

Age 68-78
Gender Female, 99.1%
Happy 99.5%
Surprised 6.3%
Fear 5.9%
Sad 2.2%
Angry 0.1%
Disgusted 0.1%
Confused 0%
Calm 0%

AWS Rekognition

Age 26-36
Gender Female, 98.4%
Sad 99.8%
Calm 16.1%
Happy 7.2%
Surprised 7.1%
Fear 6.2%
Confused 1.9%
Disgusted 1.5%
Angry 0.4%

AWS Rekognition

Age 6-12
Gender Male, 90.8%
Fear 39.7%
Calm 37.6%
Angry 11.7%
Surprised 6.8%
Sad 4.7%
Happy 4.6%
Confused 2.4%
Disgusted 1.9%
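
The three AWS Rekognition face records above (age range, gender, and ranked emotion scores) match the shape of the DetectFaces response when all attributes are requested. A minimal sketch of that call follows (boto3; the file name and region are illustrative assumptions).

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")  # assumed region

with open("county_fair_central_ohio.jpg", "rb") as f:  # placeholder file name
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request AgeRange, Gender, Emotions, etc.
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions are returned with uppercase type names and 0-100 confidences.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")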

Microsoft Cognitive Services

Age 80
Gender Male

Feature analysis

Amazon

Adult 99.2%
Female 99.2%
Person 99.2%
Woman 99.2%
Baby 98.3%
Chair 91.3%