Human Generated Data

Title

Untitled (group portrait of ten member family in living room decorated for Christmas)

Date

1948

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9159

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Person 99.6
Human 99.6
Person 99.2
Person 98.5
Person 98.2
Person 98
Person 96.4
Person 96
Person 91.4
Person 87.1
Face 79.7
People 79.5
Wheel 79.2
Machine 79.2
Car 71.5
Automobile 71.5
Vehicle 71.5
Transportation 71.5
Portrait 65.7
Photography 65.7
Photo 65.7
Tree 61.2
Plant 61.2
Urban 59.7
Road 58.1

Clarifai
created on 2023-10-26

people 100
adult 99.6
group 98.9
child 98.7
man 98
vehicle 98
woman 97.6
group together 97.3
transportation system 97.2
monochrome 97.1
offspring 93.9
recreation 93.7
sitting 93.4
many 92.8
several 92.4
sit 91
nostalgia 90.4
chair 90.3
administration 87.9
leader 86.9

Imagga
created on 2022-01-23

person 38.5
man 38.3
wheelchair 33.8
people 30.7
adult 26.2
chair 26.1
room 23.4
male 22.8
patient 22.1
men 21.5
senior 20.6
motor scooter 18.3
classroom 18.3
teacher 16.2
smiling 15.9
lifestyle 15.9
seat 15.9
women 15.8
indoors 15.8
wheeled vehicle 15.8
vehicle 15.7
sitting 15.5
happy 15
case 14.1
health 13.9
group 13.7
sick person 13.4
couple 13.1
mature 13
training 12.9
medical 12.4
outdoors 11.9
student 11.9
casual 11.9
working 11.5
portrait 11
work 11
hospital 10.9
worker 10.7
nurse 10.6
class 10.6
retirement 10.6
elderly 10.5
together 10.5
school 10.5
sport 10.5
looking 10.4
home 10.4
professional 10.4
active 10.3
team 9.8
business 9.7
retired 9.7
boy 9.6
education 9.5
furniture 9.4
smile 9.3
exercise 9.1
care 9
students 8.8
gym 8.6
conveyance 8.5
guy 8.4
old 8.4
city 8.3
one 8.2
teenager 8.2
job 8
businessman 7.9
disabled 7.9
determination 7.8
athlete 7.8
hands 7.8
equipment 7.8
concentration 7.7
exercising 7.7
youth 7.7
illness 7.6
talking 7.6
hand 7.6
college 7.6
strength 7.5
inside 7.4
cheerful 7.3
office 7.2
day 7.1
happiness 7

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

text 98.2
road 97.2
clothing 94.2
person 87.3
man 71.4
land vehicle 70.3
vehicle 67.3

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 48-56
Gender Female, 58.5%
Sad 90.5%
Happy 8.1%
Calm 0.6%
Confused 0.2%
Angry 0.2%
Disgusted 0.2%
Fear 0.2%
Surprised 0.1%

AWS Rekognition

Age 27-37
Gender Male, 98.7%
Calm 72.4%
Happy 24.1%
Surprised 1.2%
Disgusted 0.8%
Confused 0.7%
Sad 0.4%
Fear 0.3%
Angry 0.2%

AWS Rekognition

Age 45-53
Gender Male, 99.8%
Happy 64.4%
Calm 23.6%
Confused 6.8%
Sad 2.1%
Surprised 1.7%
Disgusted 0.9%
Angry 0.4%
Fear 0.2%

AWS Rekognition

Age 43-51
Gender Male, 99.5%
Happy 68.2%
Sad 15.5%
Confused 8.6%
Calm 3.5%
Disgusted 1.6%
Surprised 1.2%
Fear 0.9%
Angry 0.6%

AWS Rekognition

Age 31-41
Gender Male, 98.1%
Happy 82.2%
Calm 11.6%
Sad 4%
Confused 0.7%
Surprised 0.6%
Disgusted 0.4%
Fear 0.3%
Angry 0.3%

AWS Rekognition

Age 33-41
Gender Male, 76.9%
Calm 40.5%
Confused 15.8%
Sad 13.4%
Happy 11.9%
Surprised 5.8%
Fear 5%
Angry 4.6%
Disgusted 3.1%

AWS Rekognition

Age 6-16
Gender Female, 96.9%
Calm 99.2%
Sad 0.7%
Confused 0%
Angry 0%
Happy 0%
Disgusted 0%
Surprised 0%
Fear 0%

AWS Rekognition

Age 51-59
Gender Male, 98.8%
Confused 38.5%
Happy 37%
Calm 10.8%
Sad 5.8%
Disgusted 3.3%
Surprised 2.2%
Fear 1.5%
Angry 1%

AWS Rekognition

Age 27-37
Gender Male, 98%
Sad 90.5%
Calm 7.7%
Happy 0.5%
Fear 0.4%
Confused 0.3%
Angry 0.3%
Surprised 0.2%
Disgusted 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.6%
Wheel 79.2%
Car 71.5%

Text analysis

Amazon

13150
MJI
MJI BEEF
BEEF

Google

315 () MJA YT33A 02MA
315
()
MJA
YT33A
02MA